I want to find an automated monitoring service, like Pingdom or NewRelic, that will track the total user-perceived page load time and analyze it (along the lines of http://code.google.com/speed/page-speed/docs/rules_intro.html).
I already have Pingdom for absolute external page request time and NewRelic for all kinds of internal performance metrics.
I want to measure the actual time between a request and the user being able to use the page, as measured by Firebug, YSlow, etc. (another example here: http://tools.pingdom.com/).
I am looking for a fully automated service with metric reporting. I can measure this manually myself a variety of ways, but that's just the beginning.
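For context, one way to script the manual measurement is the browser's Navigation Timing API. The helper below is only a sketch (the function name `pageTimings` is mine; the `performance.timing` fields are the standard ones):

```javascript
// Derive user-perceived timings from a Navigation Timing record.
// In a browser, call pageTimings(window.performance.timing);
// all raw values are milliseconds since the epoch.
function pageTimings(t) {
  return {
    ttfb: t.responseStart - t.navigationStart,                // first byte arrives
    domReady: t.domContentLoadedEventEnd - t.navigationStart, // page becomes usable
    fullLoad: t.loadEventEnd - t.navigationStart              // everything loaded
  };
}
```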
Any advice?
For local page speed testing, PhantomJS is extremely useful. Phantom is a headless web browser: it runs a real browser engine without a UI and provides a solid programmatic interface. For performance testing, Wesley Hales's loadreport.js is fantastic. I highly recommend using it during local development as well as for CI testing.
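A minimal PhantomJS timing script in the spirit of loadreport.js (a sketch, not the library itself) looks like this:

```javascript
// load-time.js -- run with: phantomjs load-time.js http://example.com
// Reports the elapsed wall-clock time between page.open() and load completion.
function formatLoadTime(ms) {
  return 'Load time: ' + ms + ' ms';
}

// The phantom global only exists inside PhantomJS itself.
if (typeof phantom !== 'undefined') {
  var page = require('webpage').create();
  var url = require('system').args[1];
  var start = Date.now();

  page.open(url, function (status) {
    if (status !== 'success') {
      console.log('Failed to load ' + url);
    } else {
      console.log(formatLoadTime(Date.now() - start));
    }
    phantom.exit();
  });
}
```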
EDIT (New Product): NewRelic added really impressive page-load tracking in their latest release a few weeks ago. If you're already using it for server-side monitoring, it's easy to enable. It injects a tracking JavaScript snippet into every response that measures the client side of the request.
It has great graphing, ties directly into your server-side data, and measures your actual users (not a sample of probe servers around the globe), so you can see how changes actually affect requests on your site versus a hypothetical benchmark.
This is the solution we're using in production now.
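To illustrate the idea behind that kind of injected tracker, here is a toy version. The `/rum` endpoint and parameter names are invented for illustration; NewRelic's real script does far more:

```javascript
// Build a beacon URL carrying client-side timings back to the server.
// In a browser you would fire it shortly AFTER the load event (e.g. in a
// setTimeout inside the load handler), because loadEventEnd is not yet
// populated while the load event is still being dispatched:
//   window.addEventListener('load', function () {
//     setTimeout(function () {
//       new Image().src = beaconUrl('/rum', window.performance.timing);
//     }, 0);
//   });
function beaconUrl(endpoint, t) {
  var ttfb = t.responseStart - t.navigationStart;
  var fullLoad = t.loadEventEnd - t.navigationStart;
  return endpoint + '?ttfb=' + ttfb + '&load=' + fullLoad;
}
```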
Original Answer: It looks like http://www.yottaa.com/ implements exactly what I am looking for.
In case you want to do it yourself, you can check out two libraries with which you can measure page load performance.
If you are looking for a turn-key solution, you can try Atatus, which helps you measure page load time. It also supports AJAX performance monitoring and transaction monitoring.
https://www.atatus.com
Disclaimer: Developer at Atatus
It seems Yottaa, like Pingdom and others, uses "scripts" to test the website, not real browsers?
In my opinion (as a web developer), user-perceived page load time = load time in a real web browser! For example, JavaScript can slow down the page load significantly (or even trigger an error), but you will never notice this unless you test with a real browser. If you use Flash or Flex, the situation is even worse: without a browser, the applet will never be started.
Keynote Systems and AlertFox offer such real-browser monitoring services. The latter also has a free plan (see the link below the main table): http://www.alertfox.com/plans
I know this is an older question, but I just came across it and it still seems as relevant as ever, if not more so:
There are now in-browser dev tools for auditing and quantifying performance, but as far as I know not many that present the results over time.
My Recommendation:
SpeedCurve seems to be doing this beautifully, aggregating and charting WebPageTest.org and PageSpeed / Lighthouse audit snapshots over time. Judging by their client list, they may be in no small part responsible for that shift, too.
One major caveat: the question is about user perception, but it asks about automated tools for measuring when a page becomes active. Perceived time is often very different from actual time.
If you want the page to feel fast, there are a number of tricks you can use to make things feel faster, such as making sure the most important information appears first, or adding animations that look fast.
If you want page controls to be usable quickly, for example for employees who need to fill out forms quickly and frequently, there are different tricks, both for speeding up the page load and for making sure the user knows the controls are ready and can get to them quickly.
No matter what your goals are, actual page speed is a good place to start, but it's not the only thing.
Here's an introduction to the topic: http://blog.teamtreehouse.com/perceived-performance