The Chrome development teams work on features and improvements that make our browsing and developing experience better. The Google I/O 2017 conference took place in May, and it brought significant news. Part of it concerns the DevTools, so it affects us as web developers who use Chrome. Chrome 60 is coming with many new features and changes in the DevTools. The “WOW” feature is the new Audits panel.
The Audits panel is powered by Lighthouse, which provides a comprehensive set of tests for measuring the quality of your web pages. The test categories are Performance, Accessibility, Best Practices and PWA (Progressive Web Apps).
In this article, we will explore the Audits feature, understand the categories, run it on some popular websites, cover the report results and get a taste of Lighthouse’s architecture.
The Audits panel has existed in the Chrome DevTools for a while. Before version 60 of Chrome, this panel contained only Network Utilization and Web Page Performance measurements. Now the Audits panel has been replaced with an integrated version of the Lighthouse tool.
The look and feel differ greatly between the versions. The differences are so essential that Google presents this feature as a new one. In older versions of Chrome there was a way to use Lighthouse as a browser extension or as a Node CLI tool, but now it is a built-in feature of the browser.
Audits feature before Chrome 60
The Audits panel is now powered by Lighthouse, an open-source project developed by Google. It provides a comprehensive set of tests for measuring the quality of your web pages.
“Do Better Web” is an initiative within Lighthouse to help web developers modernize their existing web applications. By running a set of tests, developers can discover new web platform APIs, become aware of performance pitfalls, and learn (newer) best practices. In other words, do better on the web! DBW is implemented as a set of standalone gatherers and audits that are run alongside the core Lighthouse tests.
To learn more about how it works and how to contribute to it, check out the Lighthouse talk from Google I/O 2017 below:
Google I/O 2017 talk about Lighthouse
This talk walks through what’s new in Lighthouse and how it has evolved into a companion for modern web development. In addition, it covers using Lighthouse in different environments (Node CLI, Chrome DevTools, WebPageTest and headless Chrome), the architecture, GitHub/Travis/CI integration, and the ways you can extend Lighthouse by authoring custom audits to run against your own site.
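Outside of the DevTools, Lighthouse can also be run from the command line (`npm install -g lighthouse`, then `lighthouse <url> --view`) or as a Node module. Here is a minimal sketch of the programmatic usage, following the project’s documented Node API at the time; exact flags and result shapes may differ between Lighthouse versions:

```js
// Minimal sketch: launch a headless Chrome and run Lighthouse against a URL.
// Assumes the `lighthouse` and `chrome-launcher` npm packages are installed.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

function launchChromeAndRunLighthouse(url, flags = {}) {
  return chromeLauncher.launch({ chromeFlags: ['--headless'] }).then(chrome => {
    flags.port = chrome.port; // point Lighthouse at the launched Chrome instance
    return lighthouse(url, flags).then(results =>
      chrome.kill().then(() => results)
    );
  });
}

launchChromeAndRunLighthouse('https://example.com')
  .then(results => console.log(results));
```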
The Audits tab is the last built-in tab in the browser DevTools. In order to use it you need to install Chrome Canary or the latest version of Chrome (60 and above).
In order to audit a page you should follow these steps:
1. Open the DevTools and move to the Audits tab.
2. Click the “Perform an audit…” button.
3. Select the categories you want to test and click “Run audit”.
4. Wait for the report; the audited page will reload several times during the run.
The Audits panel with the Lighthouse Logo before performing an audit.
Lighthouse analyzes the page according to 4 categories: Performance, Accessibility, Best Practices and Progressive Web Apps (PWA). Lighthouse runs the page through a series of tests, such as checking how it behaves on different device sizes and network speeds. It also checks for conformance to accessibility guidelines, such as color contrast and ARIA best practices.
Audits report result per category
The scores at the top are your aggregate scores for each of those categories. The rest of the report is a breakdown of each of the tests that determined your scores. Each panel focuses on one of the categories and shows the category results in an appropriate structure.
Progressive Web Apps (PWA) are reliable, fast, and engaging, although there are many things that can take a PWA from a baseline to an exemplary experience.
To help teams create the best possible experiences, Lighthouse has put together a checklist which breaks down all the things we think it takes to be a Baseline PWA, and how to take that a step further with an Exemplary PWA by providing a more meaningful offline experience, reaching interactive even faster and taking care of many more important details.
PWA results — failed tests part
When we click on the PWA circle in the top bar, the first part we see is the Failed tests list. We can read, explore and then fix the failing tests.
The next parts of the PWA report are the Passed items list and the Manual checks. Some checks must be run manually in order to verify them. Those checks are important, but they don’t affect the score.
PWA report — passed items and the manual checks parts
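As an example of fixing one of the most common failed tests in this category, “Registers a Service Worker”, here is a minimal sketch using the standard Service Worker API. The /sw.js path is just an example; to pass the offline tests, the worker itself must also respond when the network is unavailable:

```js
// Register a service worker from the page's main script. The sw.js path is
// hypothetical; it must be served from your own origin, over HTTPS.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then(registration => {
      console.log('Service worker registered with scope:', registration.scope);
    })
    .catch(error => {
      console.error('Service worker registration failed:', error);
    });
}
```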
Web performance refers to the speed with which web pages are downloaded and displayed in the user’s web browser. Web performance optimization is the field of knowledge concerned with increasing web performance.
Faster website download speeds have been shown to increase visitor retention and loyalty and user satisfaction, especially for users with slow internet connections and those on mobile devices.
The first part of the performance category is the Metrics. These metrics encapsulate the app’s performance across a number of dimensions.
Performance metrics
As you can see, there are 3 main points of loading:
- First meaningful paint: when the primary content of the page is visible.
- First interactive: when the page is first expected to be usable.
- Consistently interactive: when the page is fully interactive for the user.
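You don’t need a full audit to look at the first of these points: the standard Paint Timing API, which also shipped in Chrome 60, exposes the browser’s paint timings directly in the console. Note that it reports first-paint and first-contentful-paint, which are related to, but not identical with, Lighthouse’s first meaningful paint:

```js
// Log the browser's paint timings for the current page (Paint Timing API).
performance.getEntriesByType('paint').forEach(entry => {
  // entry.name is 'first-paint' or 'first-contentful-paint'
  console.log(entry.name + ':', Math.round(entry.startTime) + ' ms');
});
```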
The next part of the performance report is the Opportunities. These are opportunities to speed up your application by optimizing some resources, like images and text compression.
The Opportunities and Diagnostics parts
The last part is the Diagnostics. These diagnostics show more information about the page’s performance. One of them is the Critical Request Chains, which shows which resources are required for the first render of the page. We can improve the page load by reducing the length of the chains, reducing the download size of resources or deferring the download of unnecessary resources.
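For instance, if the report flags the text compression opportunity, the fix lives on the server. Here is a minimal sketch for a Node/Express server, assuming the express and compression npm packages; other stacks have their own equivalents:

```js
// Serve static assets with gzip compression enabled, so text resources
// (HTML, CSS, JS) go over the wire compressed.
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());            // gzip-compress compressible responses
app.use(express.static('public')); // 'public' is an example assets folder

app.listen(3000);
```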
Accessibility refers to the experience of users who might be outside the narrow range of the “typical” user, who might access or interact with things differently than you expect. Specifically, it concerns users who are experiencing some type of impairment or disability — and bear in mind that that experience might be non-physical or temporary.
The accessibility category contains tests that analyze how well screen readers and other assistive technologies can work with the page, for example the usage of attributes by elements, ARIA best practices, discernible names of elements and more.
Accessibility category report
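Under the hood, Lighthouse’s accessibility tests are powered by the open-source axe-core engine, and you can run it on a page yourself. A minimal sketch, assuming axe-core has already been loaded on the page (for example from a script tag):

```js
// Run the axe-core accessibility engine against the whole document and
// print each violation it finds.
axe.run(document).then(results => {
  results.violations.forEach(violation => {
    console.log(violation.id + ': ' + violation.description);
  });
});
```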
The best practices category checks some recommendations for modernizing the page and avoiding performance pitfalls, for example the application cache, HTTPS usage, deprecated APIs, permission requests from the user and more. This part contains lists of failed and passed tests.
Best practices category report
In this section, we will look at the top scores of 3 popular websites. The first is the landing page of Weather.com. The second is a results page of Google. The last is the wall page on Facebook.
Popular websites scores
We can see that PWA is the lowest-scoring category, maybe because PWA is still a new field on the web. We can also see that while the performance of Google is the best, the performance of weather.com is bad (consistently interactive only after more than 25 seconds). The accessibility of all the tested sites is good, with scores greater than 80. Accessibility is a field that gets a lot of focus nowadays and has recently become part of the law in some countries.
Lighthouse’s flow consists of a few main steps. Some of them occur in the browser and the others are executed by the Lighthouse runner.
Lighthouse architecture
Here are Lighthouse’s components:
- Driver: interfaces with the page over the Chrome Debugging Protocol.
- Gatherers: collect data from the page using the driver; the output of a gatherer is called an artifact.
- Audits: take the artifacts as input, test them and produce a result with a score.
- Report: groups the audit results into the categories and renders them in the UI.
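These components map directly to how custom audits are written. Here is a minimal sketch of one, following the shape of the custom-audit recipe in the Lighthouse repository; the exact API surface may differ between Lighthouse versions, and MyArtifact and the 3000 ms threshold are made up for illustration:

```js
// A custom audit that checks a timing artifact against a threshold.
// 'MyArtifact' would be produced by a matching custom gatherer.
const Audit = require('lighthouse').Audit;

class LoadTimeAudit extends Audit {
  static get meta() {
    return {
      category: 'MyPerformance',
      name: 'load-time-audit',
      description: 'Page becomes usable quickly',
      failureDescription: 'Page is slow to become usable',
      helpText: 'Checks that the page is usable within 3 seconds.',
      requiredArtifacts: ['MyArtifact'],
    };
  }

  static audit(artifacts) {
    const loadedTime = artifacts.MyArtifact; // milliseconds, from the gatherer
    return {
      rawValue: loadedTime,
      score: loadedTime <= 3000, // pass/fail against the threshold
    };
  }
}

module.exports = LoadTimeAudit;
```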
Accessibility and PWA have become major measures in modern web development. Companies invest time and money to improve them in their web pages. The integration of Lighthouse into the DevTools is valuable. It will help web developers be more professional and deliver pages of higher quality. I am sure we will spend a lot of time in the Audits tab, and after running it on some popular websites, it is clear we won’t be the only ones.
You can follow me to read more about Angular, JavaScript and web development.