4 Improvements To Google Analytics Real-Time Reports

Real-time reports provide useful insights that help businesses understand, instantly, how their systems are reacting, such as when you send out an email campaign or run marketing with a temporal nature. They also provide alerting and intelligence, giving insight into things that are new or different, such as a sudden increase in site traffic. Real-time also lets you win on social by capitalizing on trending topics. For example, if you noticed a blog post you published previously is suddenly gaining attention due to something happening in the news, you could highlight it on the front page of your site to draw additional attention and ‘pour fuel’ on the social fire.

Today, we're announcing 4 improvements to real-time reports. You can now:
  1. Analyze Events in real-time
  2. Break down real-time traffic by Desktop/Tablet/Mobile
  3. Create shortcuts to your favorite real-time segments
  4. Compare real-time filtered data against overall real-time data
Let’s go through the changes in more detail:

1. Realtime Events Report
With the real-time events report, you can now not only see the top events as they occur but also filter on particular event categories (and actions). Additionally, you can see whether particular segments of visitors trigger different events and debug your events deployment in real time.

To access this report, navigate to the real-time section of Google Analytics and click on the Events section. You should see a report similar to this:


Clicking on any Event Category drills down to show all the Event Actions and Event Labels for that particular category.

If you are trying to see which events a particular segment of visitors generates, that is easy as well. Any filters you set up in any part of real-time are preserved in the Events report. For example, in the screengrab above we have set up a filter to see which events are triggered by visitors arriving via organic search.

2. Content Breakdown by Desktop/Tablet/Mobile
We live in an increasingly multi-screen world, and now you can see in real time the type of device that visitors are using to visit your website (desktop, tablet, or mobile). This is available in the content report, as shown below:


As with other real-time reports, you can easily see your visitors filtered by device type (by clicking on any of “Desktop”, “Tablet”, or “Mobile”).

3. Shortcuts for your important real-time segments
We’ve heard from users that you like to look at certain segments of visitors in real-time, but dislike setting up the filters each time. Now, you can use the “Create Shortcut” feature to store your favorite segments. 


Now all you need to do is open Shortcuts from the left navigation menu and click any of your shortcuts.

4. Compare real-time filtered data to overall data
Finally, you can compare the pageviews of your segmented visitors to overall traffic as shown below. This is nifty if you want to see quick comparison trends. For example, after a G+ post I often create a filter for the “Mobile” device type and can see that mobile traffic picks up much faster and also contributes more to the initial increase in pageviews.


Stay tuned for continued improvements to real-time, a growing area of importance for your digital marketing.

Posted by the Google Analytics Real-Time team

Improving The Activity Stream In Social Reports

We’ve redesigned the Google Analytics Social Reports to make it easier to see the conversations and activity happening around your content across our Social Data Hub partners. We’re introducing two new reports that make this data easier to consume:
  • Data Hub Activity
  • Trackbacks
The activity stream was previously available via drill down from the Network Referrals or Landing Pages reports. We have now made it a standalone report. By navigating to the Data Hub Activity report you’ll see a timeline of the number of activities that have occurred in the Social Data Hub and the raw activities in a list below. You can also filter this list by any specific networks you choose. 

We’re also excited to announce that Trackbacks are now available in a standalone report. Trackbacks are all of your inbound links across the web, so you’ll be informed if anyone from a small blog to the New York Times posts a link to your site. Additionally, we are providing context for the significance of each of these trackbacks by displaying the number of visits that were driven by each endorsing URL during the reporting period. You’ll see this number presented alongside the trackback.
Image via Google’s Analytics Advocate, Justin Cutroni 
Give them a try and happy analyzing!

Posted by Linus Chou, Product Manager, Google Analytics

Expanding Universal Analytics into Public Beta

A typical consumer today uses multiple devices to surf the web and interact in many ways with your business. For most large businesses, already swimming in many sources of data, making sense of this multi-device behavior is an enormous challenge, but also an incredible opportunity.

Back in October, we announced the limited beta release of Universal Analytics as a way for businesses to understand the changing, multi-device customer journey. Today, we’re excited to welcome and invite all Google Analytics customers to try Universal Analytics.


The benefits of using Universal Analytics for your business are:
  • Understanding how customers interact with your business across many devices and touch-points
  • Gaining insight into the performance of your mobile apps
  • Improving lead generation and ROI by incorporating offline and online interactions so you can understand which channels drive the best results
  • Improving site latency by reducing client-side demands
Testimonials from the initial beta release
Our initial beta customers are using Universal Analytics and are pleased with the results. Rojeh Avanesian, VP of Marketing at PriceGrabber.com, reports:

"At PriceGrabber, we know it’s important to understand consumer shopping behavior so we can provide a more customized experience to our users. Google’s Universal Analytics will solve this problem for us and many sites that are facing this challenge and help us serve our users better by providing them with more relevant content and shopping results. We can use Google Analytics metrics to segment our users in a way that improves and simplifies the shopping experience for consumers. That’s what we strive for at PriceGrabber, to make shopping and saving money as easy as possible."


How to get started using Universal Analytics
If you’re new to Google Analytics, you can choose Universal Analytics when you set up your account. Already using Google Analytics? Create a new web property in your Google Analytics account to set up Universal Analytics and explore the new features.

Here’s what you’ll see when you create a new web property. Select the Universal Analytics column to get the new analytics.js code snippet you can implement on your website:


You can implement Universal Analytics with the new analytics.js JavaScript for websites, our iOS and Android SDKs for apps, and the new Measurement Protocol for all other platforms. 
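For reference, a minimal analytics.js page tag looks roughly like the snippet below; UA-XXXX-Y and example.com are placeholders for your own property ID and domain, and the developer guide has the authoritative version:

  <script>
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-XXXX-Y', 'example.com');  // your property ID and domain
  ga('send', 'pageview');                    // record the pageview
  </script>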

Find more details on how to set up in our help center or developer guide. (Migration guides for properties using ga.js are coming soon; until then, set up a new property in your account for Universal Analytics.)

To tag in the most flexible way possible, you can also take advantage of the Universal Analytics template available in Google Tag Manager, which allows you to make additional changes and enable new features in your analytics setup without changing the hard-coded tags on your website. Learn more about how to implement Universal Analytics through Google Tag Manager.

For more information on Universal Analytics, visit our help center and developer guides.

Happy analyzing - in the new and innovative ways you can with Universal Analytics!

Posted by JiaJing Wang, Product Manager, Google Analytics

Enhancing Google Analytics Access Controls

Today we’re excited to announce that enhanced user-access control lists are coming to Google Analytics. Google Analytics users have long been requesting more fine-grained control over access to various parts of their accounts. We listened, and we're delivering that control over the coming weeks.

Recap of the current access control system
Previously, user access was controlled with a role-based system. A user could be either a full account administrator, or a simple report viewer on your profiles.

How the new system will work
First, we’re expanding where permissions can be applied. We’ll allow permissions to be set not only at the Google Analytics account and profile levels, but also at the property level (learn more about these entities). Second, we’re enhancing the permissions any user can have. Instead of offering only two roles (administrator and report viewer), we’re now allowing every user to have any combination of view, edit, and manage-users access. You can customize permissions for each user at the account, property, and profile level, providing a greater variety of access than was available with the previous role-based system.

For example, one user can have full access to an entire account, another user can have edit and view access to a single property, and a third user could have view only access to a set of profiles.

Properties inherit permissions set on their parent account, and profiles inherit permissions set on their parent properties. For example, a user with view access to an account, also has view access to all of that account's properties and profiles.

Migration
We will automatically migrate all accounts over the coming weeks. When your account is migrated, you will notice a new, richer user-management interface that looks like the following:

The migration will convert account admins to users with full (manage users, edit, and view) access to that entire account; report-viewing users in the current role-based system will remain as users with view access to relevant profiles.

Conclusion
These changes help Google Analytics users better meet their access-control needs and provide an even better analytics and reporting experience. Enjoy the new controls!

Posted by Tim Thelin & Matt Matyas, Google Analytics Team

Understand Updates To Your Account With Change History

Have you ever wanted to learn more about changes made to your Google Analytics account, wanted to refresh your memory as to when a particular profile setting was changed, or wondered who on the team made a goal change? Now, all of that is possible with the launch of Change History.

What it does
Change History presents a summary of many important changes to your account over the last 180 days. Users will find records of changes made to users, accounts, properties, profiles, goals, and filters. This feature is available only to Analytics account administrators.

How to find it
We are rolling out Change History to our customers over the coming weeks. Once it’s enabled on your account, you’ll be able to see it by clicking the “Admin” button in the upper right corner of the Analytics interface, selecting the appropriate account, and clicking the tab labeled “Change History.” In this new section, you will see a list of changes on your account, when the change took place, and who made the change.


Conclusion
The new Change History helps you better understand how your accounts evolve over time, improves account collaboration, and provides an additional tool for optimal configuration.

Be sure to view our help center article for additional information.

Posted by Scott Ellsworth and Matt Matyas, Google Analytics team

Increasing Your Analytics Productivity With UI Improvements

We’re always working on making Analytics easier for you to use. Since launching the latest version of Google Analytics (v5), we’ve been collecting qualitative and quantitative feedback from our users in order to improve the experience. Below is a summary of the latest updates. Some you may already be using, but all will be available shortly if you’re not seeing them yet. 


Make your dashboards better with new widgets and layout options



Use the map, device, and bar chart widgets to create a perfectly tailored dashboard for your audience. Get creative with these and produce, share, and export custom dashboards that look exactly how you want, with the metrics that matter to you. We have also introduced improvements that let you customize the layout of your dashboards to better suit individual needs. In addition, dashboards now support advanced segments!

Get to your most frequently used reports quicker

You’ll notice we’ve made the sidebar of Google Analytics even more user-friendly, including quick access to your all-important shortcuts:


If you’re not already creating Shortcuts, read more about them and get started today. We have also enabled shortcuts for real-time reports, which lets you, for example, set up a shortcut for a specific region and see its traffic in real time.

Navigate to recently used reports and profiles quicker with Recent History


Ever browse around Analytics and want to go back to a previous report? Instead of digging for it, use Recent History to jump straight back.

Improving search functionality



The improved search lets you look across all of your reports, shortcuts, and dashboards at once to find what you need.

Keyboard shortcuts

In case you've never seen them, Google Analytics does have some keyboard shortcuts. Be sure you’re using them to move around faster. Here are a few useful ones:

Search: s or / (open the quick search list)
Account list: Shift + a (open the quick account list)
Set date range: d + t (set the date range to today)
On-screen guide: Shift + ? (view the complete list of shortcuts)

Easier YoY Date Comparison


The new quick-selection option lets you choose the previous year to prefill the date range, making year-over-year analysis much faster.

Export to Excel & Google Docs 

Exporting keeps getting better, and now includes native Excel XLSX support and export to Google Docs:


We hope you find these improvements useful. As always, feel free to let us know how we can make Analytics even more usable so you can get the information you need and take action faster.

Posted by Nikhil Roy, Google Analytics Team

Multi-armed Bandit Experiments

This article describes the statistical engine behind Google Analytics Content Experiments. Google Analytics uses a multi-armed bandit approach to managing online experiments. A multi-armed bandit is a type of experiment where:
  • The goal is to find the best or most profitable action
  • The randomization distribution can be updated as the experiment progresses
The name "multi-armed bandit" describes a hypothetical experiment where you face several slot machines ("one-armed bandits") with potentially different expected payouts. You want to find the slot machine with the best payout rate, but you also want to maximize your winnings. The fundamental tension is between "exploiting" arms that have performed well in the past and "exploring" new or seemingly inferior arms in case they might perform even better. There are highly developed mathematical models for managing the bandit problem, which we use in Google Analytics content experiments.

This document starts with some general background on the use of multi-armed bandits in Analytics. Then it presents two examples of simulated experiments run using our multi-armed bandit algorithm. It then addresses some frequently asked questions, and concludes with an appendix describing technical computational and theoretical details.

Background

How bandits work

Twice per day, we take a fresh look at your experiment to see how each of the variations has performed, and we adjust the fraction of traffic that each variation will receive going forward. A variation that appears to be doing well gets more traffic, and a variation that is clearly underperforming gets less. The adjustments we make are based on a statistical formula (see the appendix if you want details) that considers sample size and performance metrics together, so we can be confident that we’re adjusting for real performance differences and not just random chance. As the experiment progresses, we learn more and more about the relative payoffs, and so do a better job in choosing good variations.

Benefits

Experiments based on multi-armed bandits are typically much more efficient than "classical" A-B experiments based on statistical-hypothesis testing. They’re just as statistically valid, and in many circumstances they can produce answers far more quickly. They’re more efficient because they move traffic towards winning variations gradually, instead of forcing you to wait for a "final answer" at the end of an experiment. They’re faster because samples that would have gone to obviously inferior variations can be assigned to potential winners. The extra data collected on the high-performing variations can help separate the "good" arms from the "best" ones more quickly.
Basically, bandits make experiments more efficient, so you can try more of them. You can also allocate a larger fraction of your traffic to your experiments, because traffic will be automatically steered to better performing pages.

Examples

A simple A/B test

Suppose you’ve got a conversion rate of 4% on your site. You experiment with a new version of the site that actually generates conversions 5% of the time. You don’t know the true conversion rates of course, which is why you’re experimenting, but let’s suppose you’d like your experiment to be able to detect a 5% conversion rate as statistically significant with 95% probability. A standard power calculation [1] tells you that you need 22,330 observations (11,165 in each arm) to have a 95% chance of detecting a .04 to .05 shift in conversion rates. Suppose you get 100 visits per day to the experiment, so the experiment will take 223 days to complete. In a standard experiment you wait 223 days, run the hypothesis test, and get your answer.
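For reference, here is a sketch of the textbook two-proportion sample-size formula behind that power calculation (power.prop.test may differ in small implementation details):

  n \text{ per arm} \approx \frac{\left(z_{1-\alpha/2}\sqrt{2\bar{p}(1-\bar{p})} + z_{1-\beta}\sqrt{p_1(1-p_1)+p_2(1-p_2)}\right)^2}{(p_1-p_2)^2}

With p_1 = 0.04, p_2 = 0.05, \bar{p} = 0.045, \alpha = 0.05, and power 1-\beta = 0.95 (z values of 1.96 and 1.645), this gives about 11,166 observations per arm, essentially matching the 11,165 quoted above.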

Now let’s manage the 100 visits each day through the multi-armed bandit. On the first day about 50 visits are assigned to each arm, and we look at the results. We use Bayes' theorem to compute the probability that the variation is better than the original [2]. One minus this number is the probability that the original is better. Let’s suppose the original got really lucky on the first day, and it appears to have a 70% chance of being superior. Then we assign it 70% of the traffic on the second day, and the variation gets 30%. At the end of the second day we accumulate all the traffic we’ve seen so far (over both days), and recompute the probability that each arm is best. That gives us the serving weights for day 3. We repeat this process until a set of stopping rules has been satisfied (we’ll say more about stopping rules below).
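As a sketch of that calculation, assuming independent Beta(1, 1) priors on the two conversion rates (the appendix gives the exact model used in production), the posterior probability that the variation beats the original is

  \Pr(p_B > p_A \mid \text{data}) = \int_0^1\!\!\int_0^1 \mathbf{1}(p_B > p_A)\,\mathrm{Beta}(p_A \mid 1+s_A,\, 1+f_A)\,\mathrm{Beta}(p_B \mid 1+s_B,\, 1+f_B)\, dp_A\, dp_B,

where s and f count the conversions and non-conversions observed so far on each arm.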

Figure 1 shows a simulation of what can happen with this setup. In it, you can see the serving weights for the original (the black line) and the variation (the red dotted line), essentially alternating back and forth until the variation eventually crosses the line of 95% confidence. (The two percentages must add to 100%, so when one goes up the other goes down). The experiment finished in 66 days, so it saved you 157 days of testing.




Figure 1. A simulation of the optimal arm probabilities for a simple two-armed experiment. These weights give the fraction of the traffic allocated to each arm on each day.

Of course this is just one example. We re-ran the simulation 500 times to see how well the bandit fares in repeated sampling. The distribution of results is shown in Figure 2. On average the test ended 175 days sooner than the classical test based on the power calculation. The average savings was 97.5 conversions.





Figure 2. The distributions of the amount of time saved and the number of conversions saved vs. a classical experiment planned by a power calculation. Assumes an original with 4% CvR and a variation with 5% CvR.

But what about statistical validity? If we’re using less data, doesn’t that mean we’re increasing the error rate? Not really. Out of the 500 experiments shown above, the bandit found the correct arm in 482 of them. That’s 96.4%, which is about the same error rate as the classical test. There were a few experiments where the bandit actually took longer than the power analysis suggested, but only in about 1% of the cases (5 out of 500).

We also ran the opposite experiment, where the original had a 5% success rate and the variation had 4%. The results were essentially symmetric. Again the bandit found the correct arm 482 times out of 500. The average time saved relative to the classical experiment was 171.8 days, and the average number of conversions saved was 98.7.

Stopping the experiment

By default, we force the bandit to run for at least two weeks. After that, we keep track of two metrics.
The first is the probability that each variation beats the original. If we’re 95% sure that a variation beats the original then Google Analytics declares that a winner has been found. Both the two-week minimum duration and the 95% confidence level can be adjusted by the user.

The second metric that we monitor is the "potential value remaining in the experiment", which is particularly useful when there are multiple arms. At any point in the experiment there is a "champion" arm believed to be the best. If the experiment ended "now", the champion is the arm you would choose. The "value remaining" in an experiment is the amount of increased conversion rate you could get by switching away from the champion. The whole point of experimenting is to search for this value. If you’re 100% sure that the champion is the best arm, then there is no value remaining in the experiment, and thus no point in experimenting. But if you’re only 70% sure that an arm is optimal, then there is a 30% chance that another arm is better, and we can use Bayes’ rule to work out the distribution of how much better it is. (See the appendix for computational details).

Google Analytics ends the experiment when there’s at least a 95% probability that the value remaining in the experiment is less than 1% of the champion’s conversion rate. That’s a 1% improvement, not a one percentage point improvement. So if the best arm has a conversion rate of 4%, then we end the experiment if the value remaining in the experiment is less than .04 percentage points of CvR.
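In symbols, one way to write this stopping rule (a sketch consistent with the description above; the exact computation is in the appendix): let \theta^* be the conversion rate of the current champion and \theta_{\max} the largest conversion rate among all arms in a draw from the posterior. The value remaining in that draw is

  V = \frac{\theta_{\max} - \theta^*}{\theta^*},

and the experiment ends once \Pr(V < 0.01) \ge 0.95, i.e. once the 95th percentile of the posterior distribution of V falls below 1% of the champion's conversion rate.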

Ending an experiment based on the potential value remaining is nice because it handles ties well. For example, in an experiment with many arms, it can happen that two or more arms perform about the same, so it does not matter which is chosen. You wouldn’t want to keep running the experiment until you identified "the" optimal arm, because when two arms are effectively tied that could take a very long time. You just want to run the experiment until you’re sure that switching arms won’t help you very much.

More complex experiments

The multi-armed bandit’s edge over classical experiments increases as the experiments get more complicated. You probably have more than one idea for how to improve your web page, so you probably have more than one variation that you’d like to test. Let’s assume you have 5 variations plus the original. You’re going to do a calculation where you compare the original to the largest variation, so we need to do some sort of adjustment to account for multiple comparisons. The Bonferroni correction is an easy (if somewhat conservative) adjustment, which can be implemented by dividing the significance level of the hypothesis test by the number of arms. Thus we do the standard power calculation with a significance level of .05 / (6 - 1), and find that we need 15,307 observations in each arm of the experiment. With 6 arms that’s a total of 91,842 observations. At 100 visits per day the experiment would have to run for 919 days (over two and a half years). In real life it usually wouldn’t make sense to run an experiment for that long, but we can still do the thought experiment as a simulation.

Now let’s run the 6-arm experiment through the bandit simulator. Again, we will assume an original arm with a 4% conversion rate, and an optimal arm with a 5% conversion rate. The other 4 arms include one suboptimal arm that beats the original with a conversion rate of 4.5%, and three inferior arms with rates of 3%, 2%, and 3.5%. Figure 3 shows the distribution of results. The average experiment duration is 88 days (vs. 919 days for the classical experiment), and the average number of saved conversions is 1,173. There is a long tail to the distribution of experiment durations (they don’t always end quickly), but even in the worst cases, running the experiment as a bandit saved over 800 conversions relative to the classical experiment.





Figure 3. Savings from a six-armed experiment, relative to a Bonferroni adjusted power calculation for a classical experiment. The left panel shows the number of days required to end the experiment, with the vertical line showing the time required by the classical power calculation. The right panel shows the number of conversions that were saved by the bandit.

The cost savings are partially attributable to ending the experiment more quickly, and partly attributable to the experiment being less wasteful while it is running. Figure 4 shows the history of the serving weights for all the arms in the first of our 500 simulation runs. There is some early confusion as the bandit sorts out which arms perform well and which do not, but the very poorly performing arms are heavily downweighted very quickly. In this case, the original arm has a "lucky run" to begin the experiment, so it survives longer than some other competing arms. But after about 50 days, things have settled down into a two-horse race between the original and the ultimate winner. Once the other arms are effectively eliminated, the original and the ultimate winner split the 100 observations per day between them. Notice how the bandit is allocating observations efficiently from an economic standpoint (they’re flowing to the arms most likely to give a good return), as well as from a statistical standpoint (they’re flowing to the arms that we most want to learn about).





Figure 4. History of the serving weights for one of the 6-armed experiments.

Figure 5 shows the daily cost of running the multi-armed bandit relative to an "oracle" strategy of always playing arm 2, the optimal arm. (Of course this is unfair because in real life we don’t know which arm is optimal, but it is a useful baseline.) On average, each observation allocated to the original costs us .01 of a conversion, because the conversion rate for the original is .01 less than arm 2. Likewise, each observation allocated to arm 5 (for example) costs us .03 conversions because its conversion rate is .03 less than arm 2. If we multiply the number of observations assigned to each arm by the arm’s cost, and then sum across arms, we get the cost of running the experiment for that day. In the classical experiment, each arm is allocated 100 / 6 visits per day (on average, depending on how partial observations are allocated). It works out that the classical experiment costs us 1.333 conversions each day it is run. The red line in Figure 5 shows the cost to run the bandit each day. As time moves on, the experiment becomes less and less wasteful as inferior arms are given less weight.
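Spelling out that arithmetic, the five non-optimal arms cost .01, .005, .02, .03, and .015 conversions per observation relative to arm 2, so

  \text{daily cost (classical)} = \frac{100}{6}\,(0.01 + 0.005 + 0.02 + 0.03 + 0.015 + 0) = \frac{100}{6}\times 0.08 \approx 1.333 \text{ conversions}.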





Figure 5. Cost per day of running the bandit experiment. The constant cost per day of running the classical experiment is shown by the horizontal dashed line.

[1] The R function power.prop.test performed all the power calculations in this article.
[2] See the appendix if you really want the details of the calculation. You can skip them if you don’t.

Posted by Steven L. Scott, PhD, Sr. Economic Analyst, Google

Tagging just got easier: Built-in templates for popular tags in Google Tag Manager

One of our favorite features of Google Tag Manager is the ability to add new tags to your site using a tag template instead of copying-and-pasting code — and we’ve just made tagging even easier with several new built-in tag templates. Just add a few key details to the template, and Google Tag Manager will automatically generate the correct code.

We’ve teamed up with a variety of companies to provide our first wave of Tag Vendor templates, including:

This is just the first wave of supported tags, and you can look forward to many more coming soon. If you have specific requests, we’d love to hear them in our Google Tag Manager Forum in the Feature Requests section.

If you’re a tag vendor, and you’d like to get your tag supported in Google Tag Manager through the Tag Vendor Program, follow the instructions here to get started. And thanks to all of our partners for your support and involvement with Google Tag Manager!

Getting The Most Out Of Google Analytics For Lead Generation

The following is a guest post from Jeff Sauer, Vice President at Three Deep Marketing, a Google Analytics Certified Partner. Jeff recently started a website dedicated to advancing digital marketing knowledge called Jeffalytics.

Lead generators know that Google AdWords + Google Analytics is a winning combination for generating an inflow of high quality leads. They are like peanut butter and jelly, Forrest Gump and Jennay, Mel Gibson and Danny Glover.
What many users may not realize is that there are many features they can unlock in Google Analytics to make their lead generation campaigns perform better while becoming more transparent and accountable. What follows is a series of tips, tricks and hacks that you can use to make your lead generation campaigns work even better. I have broken this down into three sections: Configuration, Integration, and Analysis.

Configuring Analytics for Lead Generation Websites

Set Up Goals in Google Analytics
Yes, this is a very elementary step in your Google Analytics evolution. You surely configured goals on your site years ago, right? Well, let's make sure you didn't miss anything: 
  1. Navigate to the URL of your 'thank you' page shown after a lead is generated. Make note of the URL of this page.
  2. Make your best guess as to the value of each lead that you generate (note: you can have multiple lead values, and multiple goals).
  3. Configure your goals in Google Analytics, assigning the proper goal value for each lead you generate.
  4. Unlock a new world of reports in Google Analytics and see the real value of your lead generation efforts.

Bonus tip: There's absolutely nothing wrong with measuring micro conversions on your lead generation site. Have a PDF that someone can download freely? Set a goal and assign it a modest value (even if it's $5, the impact can be huge). Have a 2 minute video? Give it a value as well, even if it's just a dollar or two. Both PDF downloads and video plays can be tracked using GA event tracking - and you can configure goals around events.  
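For example, a micro conversion like a PDF download or a video play could be recorded with classic (ga.js) event tracking along these lines; the category, action, and label values are purely illustrative:

  // fire when the visitor clicks the PDF link
  _gaq.push(['_trackEvent', 'PDF', 'Download', 'pricing-guide.pdf']);

  // fire when the embedded video starts playing
  _gaq.push(['_trackEvent', 'Video', 'Play', 'product-demo']);

You can then create an event goal that matches on the category and action and assign it the modest value discussed above.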
Track Visitors Across Domains
Many lead generation sites use third party forms and services to capture leads, whether as part of an affiliate program or a third party CRM site. While this acts as an excellent conduit to lead delivery, it can often result in missing data in Google Analytics reports. Depending on the services used, there is still a way to retain this data in Google Analytics by tracking your visitors across domains. Here's how this is done: 
  1. On your primary website, add the _gaq.push(['_setDomainName', 'PRIMARY DOMAIN']); and _gaq.push(['_setAllowLinker', true]); methods.
  2. When linking to your external domain, add an onclick attribute as follows: onclick="_gaq.push(['_link', 'THE LINK']);" where THE LINK is the URL of your external page (a combined example appears below)
  3. Add the GA Tracking Code to your third party hosted page, being sure to use the _gaq.push(['_setDomainName', 'PRIMARY DOMAIN']); and _gaq.push(['_setAllowLinker', true]); methods on this page as well. It is important to make sure you are setting your primary domain here as well. 
  4. Configure your goals to match the thank you page URL on the third party domain (or on your own site if you can redirect visitors back to your domain)
By linking visits across domains, your reports will accurately attribute visitors and goals to their proper source and medium instead of treating them as direct visitors.  
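Putting steps 1 through 3 together, the tracking code on both domains would look roughly like the sketch below; UA-XXXX-Y and the domain names are placeholders:

  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXX-Y']);
  _gaq.push(['_setDomainName', 'primary-domain.com']);  // your primary domain, set on both sites
  _gaq.push(['_setAllowLinker', true]);
  _gaq.push(['_trackPageview']);

And on the link pointing to the third-party form (the return false keeps the browser from following the plain href after _link has passed the cookie values along):

  <a href="http://forms.third-party.com/lead"
     onclick="_gaq.push(['_link', this.href]); return false;">Request a quote</a>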
Integrate with Google AdWords Both Ways
Most of us know to share data between AdWords and Analytics and enable the Google AdWords report in Analytics, but many times this is not done properly. In addition, not enough marketers seem to take advantage of Google Analytics' ability to push conversion data back into AdWords. You really have nothing to lose when you integrate these two Google products both ways, but you have many insights to gain. Start off by making sure you configure these integrations properly: 
  1. Share Google AdWords data with Google Analytics. This may seem easy, but is often incomplete when implemented. Make sure that you 1) Turn on Auto Tagging in AdWords, 2) Enable Data Sharing and 3) Apply Cost Data into Google Analytics
  2. Configure your goals in Google Analytics as outlined above
  3. As soon as data starts to collect for these goals, you will see the option in AdWords to import your goals from Google Analytics
  4. Enjoy consistent conversion data between both products and ensure that leads are being properly attributed
Using your goals in Google Analytics for your Google AdWords campaigns can come in handy when you don't have the ability to add a traditional JavaScript based conversion code onto your thank you page. In addition, importing goals from Google Analytics allows you to track some of the advanced conversions mentioned below in Google AdWords. The result? Better analysis capabilities, more advanced conversion rate optimization strategy and more credit for the leads you generate! 

Integrating Analytics into Lead Generation Efforts

Phone Call Tracking
One thing that marketers may not realize is that for many industries, the majority of leads will come in through the phone instead of through a web form. Google AdWords understands this and now offers a robust system for tracking phone leads generated by AdWords. But how do you properly track and attribute phone calls generated from your site to a particular traffic source? You integrate Google Analytics with your call tracking provider.


This sounds complicated, but it really is not too bad. In fact, many phone tracking vendors offer a Google Analytics integration option as part of their service. For example, this works well with products like Marchex Voicestar and Mongoose Metrics among others.  
Here are the basics of how this process works: 
  1. Sign up with a phone call tracking service, create tracking numbers and appropriate campaigns
  2. Place tracking phone numbers on your website
  3. Specify a post-back URL to be visited when a successful phone call occurs
  4. Your phone tracking system will send a visit to the post back URL, complete with all Google Analytics cookie values for the visitor who saw that exact tracking number on your lead generation site

Please note that if you drive a lot of traffic to your website, it can take a lot of phone numbers and extensions to fully attribute phone calls to users. As such, you may want to start by implementing this method for a small segment of your traffic and then build up to all visitors once this data proves useful.

Also note that even if you don't link calls back to Google Analytics, phone call tracking is still an imperative part of any lead generation campaign, because it's common for 30-70% of the leads you generate to come from the phone in certain industries. 
Offline Marketing
Believe it or not, in many industries leads are still generated offline. Examples include trade shows, neighborhood canvassing (going door to door promoting a product or service), print and television advertising. These are activities that companies have been doing for years, but the problem that they run into when using these mediums to drive traffic to their website is that they don't register the traffic source properly in Google Analytics. The result: many direct visitors without proper attribution. 


How do we fix this? By following this simple process: 
  1. Create a vanity URL that is unique to your campaign (can be a sub folder or new domain)
  2. Create a tracking URL for your website using the Google Analytics URL Builder 
  3. 301 redirect your vanity URL to the tracking URL (this preserves your campaign attributes; see the example below)
  4. Learn about how each traffic source performed by viewing your favorite reports in Google Analytics and paying attention to the source/medium/campaign 
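For example (the URLs and campaign names below are purely illustrative), a trade-show postcard might advertise a short vanity URL that 301-redirects to a landing page tagged with the URL Builder:

  Vanity URL:   http://www.example.com/expo
  Tracking URL: http://www.example.com/landing-page?utm_source=spring-expo&utm_medium=offline&utm_campaign=tradeshow-leads

Because the redirect preserves the utm_ parameters, visits from the postcard are credited to that source, medium, and campaign instead of showing up as direct traffic.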
Now you can put your offline and online leads on a level playing field and compare the effectiveness of both side by side. 
CRM Integration
For companies that are generating several leads a day, a Customer Relationship Management (CRM) system becomes imperative for keeping up with the leads coming in the door. Unfortunately, most CRM implementations are not integrated fully with the website and useful data is not shared between the two systems. This can create friction between sales and marketing, while making it nearly impossible to close the loop on what lead generation efforts are working the best.

Fortunately, people smarter than myself have found a way to solve this problem, and this solution for CRM integration by Justin Cutroni has become my gold standard for how to pull information out of Google Analytics cookies and attach it to the lead record you enter into your CRM system.

While Justin's post goes into great detail, the basic premise is this: 
  1. A visitor comes to your website and has source/medium/campaign/keyword information assigned to them in their Google Analytics cookie
  2. This information is accessible to your website by pulling cookie values out of Google Analytics using JavaScript
  3. Once this information is pulled out, you enter the values into hidden form fields underneath where your lead enters their contact information (a rough sketch follows this list)
  4. The vital information (source/medium/campaign/keyword term) is passed into your CRM system alongside the lead record
  5. Your sales team can now have deeper understanding of what type of traffic generates the best leads, all the way down to a keyword level
  6. You can use this information to refine your marketing efforts and campaigns to focus on your top performers
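As a rough sketch of steps 2 and 3, here is one way to read the classic ga.js __utmz campaign cookie with JavaScript and copy its values into hidden form fields. The cookie field names (utmcsr, utmcmd, utmccn, utmctr) are the ones ga.js writes, but the form field IDs are hypothetical, and Justin's post has a production-ready version:

  // pull a single field (e.g. utmcsr) out of the __utmz campaign cookie
  function getUtmzField(field) {
    var match = document.cookie.match(/__utmz=[^;]+/);
    if (!match) return '';
    var value = match[0].split(field + '=')[1];
    return value ? value.split('|')[0] : '';
  }

  // copy source/medium/campaign/keyword into hidden form fields on the lead form
  document.getElementById('lead_source').value   = getUtmzField('utmcsr');  // source
  document.getElementById('lead_medium').value   = getUtmzField('utmcmd');  // medium
  document.getElementById('lead_campaign').value = getUtmzField('utmccn'); // campaign
  document.getElementById('lead_keyword').value  = getUtmzField('utmctr');  // keyword term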
Sharing information between your website and your CRM system is an imperative step for making your marketing data actionable to the rest of the business. Without integrating, decisions are made based on faith and HIPPOs, instead of actionable data. As a note, with the advent of Universal Analytics this is likely to get even easier.  

Analyze the Results and Make Your Site Even Better

How you analyze your site is a very personal thing, and your mileage may vary, so there isn't a magic bullet to ongoing success with your lead generation programs.

With that said, there are several reports that can be extremely useful in Google Analytics for lead generation campaigns. I would start by paying attention to the following: 
  • Use an advanced segment of paid search traffic and then navigate to the Conversions > Goals report. Compare the goal values you created recently with a similar time period in the past. Are your results improving? 
  • Navigate to the Multi Channel Funnels report and either use standard or custom channels. What is the most common first click channel? Are you giving it enough credit in your reporting?
  • Compare direct traffic before and after implementing the integrations suggested above. Do you start to see more activity with proper attribution? Are you more confident analyzing with less of a grey area?
  • Have you been receiving all of the credit you deserve for leads you generate over the phone?
  • When a salesperson tells you that the leads you generate "suck" are you able to match their lead close rate to the source/medium/keyword that generated the lead?
  • Instead of presenting raw lead numbers in a vacuum are you starting to factor in appointments issued, quotes given and sales made? Can you calculate the true cost of sale from keyword to purchase?
When configured properly, you can use Google Analytics and residual data from GA to perform some in-depth, closed-loop analysis of how your lead generation campaigns are performing. Savvy lead generation experts have figured out how to deliver maximum value to their clients and constituents using the capabilities built into Google Analytics. Now it's your turn.
There you have it, the three pillars to getting the most out of Google Analytics for your lead generation website. Have any cool integrations yourself? Let's talk in the comments below.
Jeff Sauer 

Announcing Enhanced Link Attribution for In-Page Analytics

In-Page Analytics provides click-through data in the context of your actual site, and is a highly effective tool to analyze your site pages and come up with actionable information that can be used to optimize your site content.

Until now, In-Page Analytics showed click-through information only by destination URL rather than by the actual link on the page, and only for links, not for other elements like buttons. The most common complaint about In-Page Analytics has been that if a page has two or more links to the same destination page, we show the same statistics for both links, since there is no telling which link the user actually clicked.

We are now introducing a new feature that solves these issues. To use Enhanced Link Attribution, you’ll need to add two lines to your tracking snippet (shown after the list below) and enable the feature in the web property settings. In-Page Analytics will then use a combination of element IDs and destination URLs to provide the following new capabilities:
  • Distinguish between multiple links to the same destination page, attributing the right number of clicks to each link.
  • Provide click-through information even when redirects are being used.
  • Provide click-through information for other elements, such as buttons or navigation controls, whose clicks trigger JavaScript code.
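At the time of writing, for the classic asynchronous (ga.js) snippet those two lines load the in-page link-ID plugin and look roughly like this, placed just before the _trackPageview call (see the help center article for the exact, current version):

  var pluginUrl = '//www.google-analytics.com/plugins/ga/inpage_linkid.js';
  _gaq.push(['_require', 'inpage_linkid', pluginUrl]);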
Here’s an example of how Enhanced Link Attribution can improve In-Page Analytics:

Before (without Enhanced Link Attribution):

After (with Enhanced Link Attribution):

Notice the following differences:
  • There’s a very prominent red ‘Add Subtitles’ link (that’s styled to look like a button) that didn’t show any click information before Enhanced Link Attribution, but now shows 20%. This is because clicking this link results in a server redirect to one of multiple pages, none of which is the URL specified in the link itself.
  • Clicks on the link “Aquarium of Genoa (Italy)” were reported as 5.1%, but this is actually shared with clicks on the thumbnail to the left of that link. With Enhanced Link Attribution the thumbnail shows 2.3% and the text link 2.8%.
  • With Enhanced Link Attribution clicks on the search button got 10%. This button results in a form submission to multiple URLs (that include the search string). We also see 0.1% clicks on the language filter, which actually causes some JavaScript code to run.
To find out more about Enhanced Link Attribution, and how to use it on your site, please read our help center article.

We are rolling out this feature gradually over the next few weeks so please be patient if you don’t have the option to enable it in the web property settings. In any case, you can tag your pages now and start collecting Enhanced Link Attribution data today.

Posted by Barak Ori, Google Analytics Team