Automated Monitoring of User-perceived Page Load Time

Published 2019-03-15 18:31

Question:

I want to find an automated monitoring service, like Pingdom and NewRelic, that will track the total user-perceived page load time and analyze it (à la http://code.google.com/speed/page-speed/docs/rules_intro.html).

I already have Pingdom for absolute external page request times and NewRelic for all kinds of internal performance metrics.

I want to measure the actual time between a request and the user being able to use the page, as measured by Firebug, YSlow, etc. (another example here: http://tools.pingdom.com/).

I am looking for a fully automated service with metric reporting. I can measure this manually myself in a variety of ways, but that's just the beginning.

Any advice?

Answer 1:

EDIT (New Product): NewRelic added really impressive page-load tracking in a release a few weeks ago. If you're already using it for server-side monitoring, it's easy to enable. It injects a JS tracking script into every page that measures the client side of each request.

It's got great graphing, ties directly to your server-side data, and measures your actual users (not a sample of servers all over the globe). So you can see how things are actually impacting the requests on your site vs. a hypothetical benchmark.

This is the solution we're using in production now.
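
For a sense of what that kind of injected script measures, here is a minimal sketch (not New Relic's actual snippet, and the `/rum-beacon` endpoint is hypothetical) that collects similar client-side numbers with the standard Navigation Timing API:

```javascript
// A sketch of injected real-user-monitoring (RUM) code, assuming a
// browser that supports the Navigation Timing API (window.performance.timing).
window.addEventListener('load', function () {
  // Defer one tick so that loadEventEnd has been populated.
  setTimeout(function () {
    var t = window.performance && window.performance.timing;
    if (!t) { return; } // older browsers lack the API

    var ttfb     = t.responseStart - t.navigationStart;            // back-end share
    var domReady = t.domContentLoadedEventEnd - t.navigationStart; // DOM usable
    var loaded   = t.loadEventEnd - t.navigationStart;             // full page load

    // Beacon the numbers back via an image GET; '/rum-beacon' is a
    // hypothetical endpoint you would implement server-side.
    new Image().src = '/rum-beacon?ttfb=' + ttfb +
                      '&dom=' + domReady + '&load=' + loaded;
  }, 0);
});
```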

Original Answer: It looks like http://www.yottaa.com/ implements exactly what I am looking for.



Answer 2:

It seems Yottaa, like Pingdom and others, uses "scripts" to test the website, not real browsers?

In my opinion (as a web developer), user-perceived page load time = load time in a real web browser! For example, JavaScript can slow down the page load time significantly (or even trigger an error), but you will never notice this unless you test with a real browser. If you use Flash or Flex, the situation is even worse: without a browser, the applet will never start.

Keynote Systems and AlertFox offer such real-browser monitoring services. The latter also has a free plan (see the link below the main table): http://www.alertfox.com/plans



Answer 3:

In case you want to do it yourself, you can check out two libraries for measuring page load performance (a minimal setup for the first is sketched after the list):

  • https://github.com/lognormal/boomerang
  • https://github.com/stevesouders/episodes
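
If you go the boomerang route, the basic setup is small. A minimal sketch, assuming boomerang.js is already included on the page via a script tag, with the beacon URL as a placeholder for your own collection endpoint:

```javascript
// Assumes boomerang.js has already been loaded on the page.
// boomerang times the page load (using Navigation Timing where the
// browser supports it) and sends the results as query parameters
// (e.g. t_done, the total load time) to the beacon URL below.
BOOMR.init({
  beacon_url: 'https://example.com/rum-beacon'  // placeholder endpoint
});
```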

If you are looking for a turn-key solution, you can try out Atatus, which helps you measure page load time. It also supports AJAX performance monitoring and transaction monitoring.

https://www.atatus.com

Disclaimer: Developer at Atatus



Answer 4:

I know this is an older question, but I just came across it and it still seems relevant, if not more so than ever:

  • The average web page size in 2010 was 702kb, compared to 2232kb in 2016 [1],
  • … around the same as the O.G. Doom install image [2], however…

The top sites have turned a corner. While the overall average page size is increasing inexorably, […] the following chart shows the progression of the global average page weight vs. the top ten websites. [2]

There are now in-browser dev tools for auditing and quantifying performance, but afaik not many that present the results over time.


My Recommendation:

SpeedCurve seems to be doing this beautifully, aggregating and charting WebPageTest.org and PageSpeed / Lighthouse Audit snapshots. By the look of their client list they might be in no small part responsible for that corner-turn too.



"Benchmark your site against your competitors; Track dozens of metrics, including custom metrics; Create performance budgets and get alerts; Mobile device emulation & CPU throttling; Diagnose performance issues at the page level; Analyze the performance impact of ads and third parties; Continuous Deployment & Visual Diffs; Create and share custom charts and dashboards; Tailor dashboards to different audiences"3



  [1] The Growth of Web Page Size — KeyCDN
  [2] The web is Doom — mobiForge
  [3] Synthetic: WebPageTest — SpeedCurve


Answer 5:

The one listed by Montherun in his comment, WebPageTest.org, is pretty good for your requirement.

If you want to test the user experience of your website from clients at different locations around the globe, you can use their RESTful API to make HTTP calls. You can set the location, browser type, network speed, etc., using HTTP parameters. You can also request an XML response, which you can parse in your scripts to produce the necessary metrics.
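
As a rough sketch of what such a call looks like (the API key, test URL, and location string below are placeholders; check the WebPageTest API docs for the exact parameters your instance supports):

```javascript
// Node.js 18+ sketch: start a WebPageTest run and fetch the raw XML.
async function startRun() {
  var params = new URLSearchParams({
    url: 'https://www.example.com/',   // page to test (placeholder)
    k: 'YOUR_API_KEY',                 // WebPageTest API key (placeholder)
    location: 'Dulles:Chrome.Cable',   // agent location : browser . connectivity
    f: 'xml'                           // response format
  });

  var res  = await fetch('https://www.webpagetest.org/runtest.php?' + params);
  var body = await res.text();

  // The XML response contains the test ID and result URLs; parse it
  // and poll the result URL until the run completes.
  console.log(body);
}

startRun().catch(console.error);
```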

On the other hand, if you want to test user performance from select locations under your control, say a remote office PC, you can deploy your own private instance at those locations to provide the same details. That requires your own private server, plus client agents installed for each type of test you want to run.

The documentation in the links should be sufficient to get started.



Answer 6:

For local page speed testing, PhantomJS is extremely useful. Phantom is a headless web browser: it runs a real browser engine without a UI and provides a solid programmatic interface. For performance testing, Wesley Hales's loadreport.js is fantastic. I highly recommend using it during local development as well as for CI testing.
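
The classic PhantomJS load-timing script is only a few lines; roughly (adapted from the PhantomJS examples, so treat it as a sketch):

```javascript
// loadspeed.js -- run with: phantomjs loadspeed.js http://www.example.com/
var page   = require('webpage').create(),
    system = require('system'),
    start, address;

if (system.args.length < 2) {
  console.log('Usage: loadspeed.js <URL>');
  phantom.exit(1);
}

address = system.args[1];
start   = Date.now();

page.open(address, function (status) {
  if (status !== 'success') {
    console.log('FAIL to load the address');
  } else {
    // Wall-clock time from request to the page's load event.
    console.log('Page load time: ' + (Date.now() - start) + ' ms');
  }
  phantom.exit();
});
```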



Answer 7:

One major caveat: the question is about user perception, but asks about automated tools for measuring when a page becomes active. Perceived time is often very different from actual time.

If you want the page to feel fast, there are a number of tricks you can use to make things feel faster, such as making sure the most important information appears first, or adding animations that look fast.

If you want page controls to be usable fast, for example for employees who need to fill out forms quickly and frequently, there are different tricks you can use, both for speeding up the page load and for making sure the user knows the controls are ready and can get to them quickly.
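
One small, hypothetical example of signalling readiness for that forms case: keep the first field disabled in the HTML until the page is actually usable, then enable and focus it (the `#order-form` selector is a stand-in for your own form):

```javascript
// Hypothetical: the field starts disabled in the HTML so the user
// can't type into a control that later script setup would clobber.
document.addEventListener('DOMContentLoaded', function () {
  var field = document.querySelector('#order-form input');
  if (field) {
    field.disabled = false; // controls are now safe to use
    field.focus();          // and the user can start typing immediately
  }
});
```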

No matter what your goals are, actual page speed is a good place to start, but it's not the only thing.

Here's an introduction to the topic: http://blog.teamtreehouse.com/perceived-performance