#I *might* (emphasis on might) be able to post a link to my current server tomorrow
smoozie · 3 years
Text
Hear me out guys,
What if I made ANOTHER server?! Because I wanna meet new people :D
Would anyone be interested in joining? I may want to keep it minors only 🤔
Text
as usual, an unrealistic list of things I’d really like to get done over the three-day weekend, which is not super likely to go well considering I’m posting this at 5 fucking p.m. but whatever:
gaming-related
I have exactly a month left on my (so far unused, whoops) PC Game Pass subscription, so I need to go over my wish list again and identify
which games have achievements
each game’s average playtime so I can prioritize
which ones interest me the most (emphasis on spooky games because...it’s spooky season)
try Fallout 76 once it finally finishes downloading, because I played the free weekend on Steam before and this is Microsoft, and...I think my character should just be on their servers but I don’t actually know hahahaha yeah that super didn’t work, maybe in a week when our billing cycle restarts I’ll try redownloading and reinstalling it, and anyway I did download and test a couple other Game Pass games
cancel my current SWTOR subscription so I’m not still paying for that while focusing on Game Pass games
play one of a few Flash games on my to-play list, if there’s something short
misc/housekeeping
check out my current backup situation and see how hard it would be to modify, I mean at some point I have got to set up an actual system but for some reason that’s intimidating so if what I currently have is at all usable, I should add to it
and then verify my drivers. I don’t know what’s wrong with my PC and I’m really not sure how to figure that out but since Memtest86 ran for three fucking hours and came back clear, it seems like this is the next major step in the troubleshooting process okay I actually didn’t do this but I did try some other things that also didn’t work
finish claiming all the Black Panther comics
a tiny bit of room cleaning? maybe?? I actually already did a very tiny bit, and this is something (one of...many things) I could do while on a call with friends, which is also in my plans
open a couple packages from one of said friends, which keen-eyed readers might note was in a to-do list ages ago oops
check Tumblr drafts
work on modifying or fixing some masks that currently aren’t working well
send an email that’s been on my to-do list for...a while
actually another email would be a good one too
keep trying to get Hazy to learn that letting people handle her paws results in good treats, so we can make an appointment for a Petco nail trim (and ideally clip them ourselves, sometimes)
ah fuck I still need to finish my will
creative
mildly edit the short fic I posted a few days ago, give it a title, and toss it on AO3
as always, some typing would be really really good
so would...some writing...
make some more potion bottles with, uh, random stuff I’ve collected on recent walks around the neighborhood (other potion bottles with other random ideas I’ve had wouldn’t be a bad idea either...and I would like to try one of the Youtube tutorials I found for making tiny hourglasses, but I guess that’s probably not a priority)
do a little reorganizing in my giant to-do lists for a) 1/6-scale projects and b) lyrics for titles
doing more research on parts for a 1/6 female Loki is really not urgent but...I might want to...and some things are on sale right now...
repair Tiny Loki’s tiny mask
rewrite my paper list of prioritized projects, which I needed to do anyway, but now I’ve also lost the original and that’s very annoying (also make a pocket for it in my notebook so this is less likely to happen again)
make designs for a few new Pride Cap shields, maybe? it really would not take long to make just a few, and now is when I should be adding stuff to Etsy if I have any hope of like...holiday sales
for that matter, now would be an extremely good time to at least start planning what kinds of holiday-specific things (and/or other new listings) I might be able to make in time to list them on Etsy
mental health
write up a post for the ADHD Reddit and maybe other related places
experiment with Notion and Airtable as organizational options
research some bullet-journal layouts to see if anything seems like it would work for me
in general, spend some time just kind of...brainstorming the type of system that would be useful for me in keeping my shit together, so I have a better idea of what I’m looking for (also probably helpful to list like...the big problems I’m trying to fix)
see if Penzu seems like a good option for a keeping-my-shit-together strategy I have in mind from my latest therapy session, and if not, do a little research on other journal-type possibilities
shopping I probably shouldn’t be doing
make a Michaels order tomorrow when both coupons will be active, because...there are some Halloween things that are somehow already sold out at the nearest store but I still want them...and they’re available at the store all the way across town...so...
possibly go to an estate sale benefiting the rescue group where we got Scully and Hazy, which is also all the way over on the other side of town but if I’m going over there anyway, I might as well
some stuff in my Etsy cart that I don’t want to miss
ditto eBay, I think mostly in my cart but also check watch list
AliExpress is also having some sales and yes there are more tiny things I want to buy for Loki’s arcane workshop, shut up (but also if I’m going to buy another Hot Toys body, this time for Thor, I gotta...take some measurements)
politics
call legislators
I really don’t know why I bother but I’ve found a bunch more articles recently that I’d like to throw on Facebook
for that matter at some point I’m probably just going to do a Facebook post like “hey, if you care about me at all, please consider voting Biden,” which also probably won’t make a difference but like...there’s a tiny chance it might
actually write those Sierra Club letters to voters that I meant to do like...two weeks ago...and maybe also some postcards, idk
maybe go to a thing Monday afternoon
also maybe just like...look through my links and folders to see who’s doing textbanking? like I don’t necessarily have to do any of it this weekend, just figure out what’s available?
........hmm this is all a terrible idea, probably, in part because my brain is looking at this absurdly long list and still going “oh shit, oh fuck, we’re forgetting something major aren’t we!!!”
wickedbananas · 6 years
How Much Data Is Missing from Analytics? And Other Analytics Black Holes
Posted by Tom.Capper
If you’ve ever compared two analytics implementations on the same site, or compared your analytics with what your business is reporting in sales, you’ve probably noticed that things don’t always match up. In this post, I’ll explain why data is missing from your web analytics platforms and how large the impact could be. Some of the issues I cover are actually quite easily addressed, and have a decent impact on traffic — there’s never been an easier way to hit your quarterly targets. ;)
I’m going to focus on GA (Google Analytics), as it's the most commonly used provider, but most on-page analytics platforms have the same issues. Platforms that rely on server logs do avoid some issues but are fairly rare, so I won’t cover them in any depth.
Side note: Our test setup (multiple trackers & customized GA)
On Distilled.net, we have a standard Google Analytics property running from an HTML tag in GTM (Google Tag Manager). In addition, for the last two years, I’ve been running three extra concurrent Google Analytics implementations, designed to measure discrepancies between different configurations.
(If you’re just interested in my findings, you can skip this section, but if you want to hear more about the methodology, continue reading. Similarly, don’t worry if you don’t understand some of the detail here — the results are easier to follow.)
Two of these extra implementations — one in Google Tag Manager and one on page — run locally hosted, renamed copies of the Google Analytics JavaScript file (e.g. www.distilled.net/static/js/au3.js, instead of www.google-analytics.com/analytics.js) to make them harder to spot for ad blockers. I also used renamed JavaScript functions (“tcap” and “Buffoon,” rather than the standard “ga”) and renamed trackers (“FredTheUnblockable” and “AlbertTheImmutable”) to avoid having duplicate trackers (which can often cause issues).
This was originally inspired by 2016-era best practice on how to get your Google Analytics setup past ad blockers. I can’t find the original article now, but you can see a very similar one from 2017 here.
Lastly, we have (“DianaTheIndefatigable”), which just has a renamed tracker, but uses the standard code otherwise and is implemented on-page. This is to complete the set of all combinations of modified and unmodified GTM and on-page trackers.
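The renamed-tracker idea described above can be sketched roughly as follows. This is not Google's actual loader code, just a minimal stand-in for the command-queue pattern that analytics.js uses; the "tcap" function name, the "FredTheUnblockable" tracker name, and the local /static/js/au3.js path come from the post, while the UA-XXXXX-Y property ID is a placeholder.

```javascript
// Minimal sketch of a renamed GA command queue (not Google's actual code).
// Commands issued before the renamed, locally hosted analytics.js copy
// loads are buffered in .q, exactly as the standard ga() stub does.
(function (w, src, name) {
  w.GoogleAnalyticsObject = name; // tells analytics.js its global was renamed
  w[name] = w[name] || function () {
    (w[name].q = w[name].q || []).push(arguments);
  };
  w[name].l = Date.now();
  // In a browser you would now inject <script async src="/static/js/au3.js">;
  // omitted here so the sketch stays self-contained.
})(globalThis, '/static/js/au3.js', 'tcap');

// Queue commands against a named tracker, as with a normal ga() call:
tcap('create', 'UA-XXXXX-Y', 'auto', 'FredTheUnblockable');
tcap('FredTheUnblockable.send', 'pageview');

console.log(tcap.q.length); // 2 buffered commands awaiting the real script
```

The point of the renaming is only obfuscation: the buffered commands are identical to a stock setup, so the real script processes them normally once it loads.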
Two of Distilled’s modified on-page trackers, as seen on https://www.distilled.net/
Overall, this table summarizes our setups:
| Tracker | Renamed function? | GTM or on-page? | Locally hosted JavaScript file? |
| --- | --- | --- | --- |
| Default | No | GTM HTML tag | No |
| FredTheUnblockable | Yes - “tcap” | GTM HTML tag | Yes |
| AlbertTheImmutable | Yes - “buffoon” | On page | Yes |
| DianaTheIndefatigable | No | On page | No |
I tested their functionality in various browser/ad-block environments by watching for the pageviews appearing in browser developer tools:
Reason 1: Ad Blockers
Ad blockers, primarily browser extensions, have been growing in popularity for some time now. That growth has mostly been driven by users looking for better performance and UX on ad-laden sites, but in recent years an increased emphasis on privacy has also crept in, hence the possibility of analytics blocking.
Effect of ad blockers
Some ad blockers block web analytics platforms by default, others can be configured to do so. I tested Distilled’s site with Adblock Plus and uBlock Origin, two of the most popular ad-blocking desktop browser addons, but it’s worth noting that ad blockers are increasingly prevalent on smartphones, too.
Here’s how Distilled’s setups fared:
(All numbers shown are from April 2018)
| Setup | Vs. Adblock | Vs. Adblock with “EasyPrivacy” enabled | Vs. uBlock Origin |
| --- | --- | --- | --- |
| GTM | Pass | Fail | Fail |
| On page | Pass | Fail | Fail |
| GTM + renamed script & function | Pass | Fail | Fail |
| On page + renamed script & function | Pass | Fail | Fail |
Seems like those tweaked setups didn’t do much!
Lost data due to ad blockers: ~10%
Ad blocker usage can be in the 15–25% range depending on region, but many of these installs will be default setups of AdBlock Plus, which as we’ve seen above, does not block tracking. Estimates of AdBlock Plus’s market share among ad blockers vary from 50–70%, with more recent reports tending more towards the former. So, if we assume that at most 50% of installed ad blockers block analytics, that leaves your exposure at around 10%.
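As a back-of-the-envelope check on the ~10% figure above, using the post's own estimates rather than measured values:

```javascript
// Rough exposure estimate: ad blocker usage of 15–25% (20% taken as a
// midpoint) times the at-most ~50% of installs that block analytics.
const adBlockerUsage = 0.20;   // midpoint of the 15–25% range
const blocksAnalytics = 0.50;  // share of ad-blocker installs blocking tracking
const lostShare = adBlockerUsage * blocksAnalytics;
console.log(`${(lostShare * 100).toFixed(0)}% of traffic invisible to GA`);
```

Both inputs vary a lot by region and vertical, so treat the product as an upper-bound ballpark, not a constant.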
Reason 2: Browser “do not track”
This is another privacy-motivated feature, this time of browsers themselves. You can enable it in the settings of most current browsers. It’s not compulsory for sites or platforms to obey the “do not track” request, but Firefox offers a stronger feature under the same set of options, which I decided to test as well.
Effect of “do not track”
Most browsers now offer the option to send a “Do not track” message. I tested the latest releases of Firefox & Chrome for Windows 10.
| Setup | Chrome “do not track” | Firefox “do not track” | Firefox “tracking protection” |
| --- | --- | --- | --- |
| GTM | Pass | Pass | Fail |
| On page | Pass | Pass | Fail |
| GTM + renamed script & function | Pass | Pass | Fail |
| On page + renamed script & function | Pass | Pass | Fail |
Again, it doesn’t seem that the tweaked setups are doing much work for us here.
Lost data due to “do not track”: <1%
Only Firefox Quantum’s “Tracking Protection,” introduced in February, had any effect on our trackers. Firefox has a 5% market share, but Tracking Protection is not enabled by default. The launch of this feature had no effect on the trend for Firefox traffic on Distilled.net.
Reason 3: Filters
It’s a bit of an obvious one, but filters you’ve set up in your analytics might intentionally or unintentionally reduce your reported traffic levels.
For example, a filter excluding certain niche screen resolutions that you believe to be mostly bots, or internal traffic, will obviously cause your setup to underreport slightly.
Lost data due to filters: ???
Impact is hard to estimate, as setups will obviously vary on a site-by-site basis. I do recommend having a duplicate, unfiltered “master” view in case you realize too late that you’ve lost something you didn’t intend to.
Reason 4: GTM vs. on-page vs. misplaced on-page
Google Tag Manager has become an increasingly popular way of implementing analytics in recent years, due to its increased flexibility and the ease of making changes. However, I’ve long noticed that it can tend to underreport vs. on-page setups.
I was also curious about what would happen if you didn’t follow Google’s guidelines in setting up on-page code.
By combining my numbers with numbers from my colleague Dom Woodman’s site (you’re welcome for the link, Dom), which happens to use a Drupal analytics add-on as well as GTM, I was able to see the difference between Google Tag Manager and misplaced on-page code (right at the bottom of the <body> tag). I then weighted this against my own Google Tag Manager data to get an overall picture of all five setups.
Effect of GTM and misplaced on-page code
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
| Browser | Google Tag Manager | Modified & Google Tag Manager | On-page code in <head> | Modified & on-page code in <head> | On-page code misplaced in <body> |
| --- | --- | --- | --- | --- | --- |
| Chrome | 100.00% | 98.75% | 100.77% | 99.80% | 94.75% |
| Safari | 100.00% | 99.42% | 100.55% | 102.08% | 82.69% |
| Firefox | 100.00% | 99.71% | 101.16% | 101.45% | 90.68% |
| Internet Explorer | 100.00% | 80.06% | 112.31% | 113.37% | 77.18% |
There are a few main takeaways here:
On-page code generally reports more traffic than GTM
Modified code is generally within a margin of error, apart from modified GTM code on Internet Explorer (see note below)
Misplaced analytics code will cost you up to a third of your traffic vs. properly implemented on-page code, depending on browser (!)
The customized setups, which are designed to get more traffic by evading ad blockers, are doing nothing of the sort.
It’s worth noting also that the customized implementations actually got less traffic than the standard ones. For the on-page code, this is within the margin of error, but for Google Tag Manager, there’s another reason — because I used unfiltered profiles for the comparison, there’s a lot of bot spam in the main profile, which primarily masquerades as Internet Explorer. Our main profile is by far the most spammed, and also acting as the baseline here, so the difference between on-page code and Google Tag Manager is probably somewhat larger than what I’m reporting.
I also split the data by mobile, out of curiosity:
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
| Device | Google Tag Manager | Modified & Google Tag Manager | On-page code in <head> | Modified & on-page code in <head> | On-page code misplaced in <body> |
| --- | --- | --- | --- | --- | --- |
| Desktop | 100.00% | 98.31% | 100.97% | 100.89% | 93.47% |
| Mobile | 100.00% | 97.00% | 103.78% | 100.42% | 89.87% |
| Tablet | 100.00% | 97.68% | 104.20% | 102.43% | 88.13% |
The further takeaway here seems to be that mobile browsers, like Internet Explorer, can struggle with Google Tag Manager.
Lost data due to GTM: 1–5%
Google Tag Manager seems to cost you a varying amount depending on what make-up of browsers and devices use your site. On Distilled.net, the difference is around 1.7%; however, we have an unusually desktop-heavy and tech-savvy audience (not much Internet Explorer!). Depending on vertical, this could easily swell to the 5% range.
Lost data due to misplaced on-page code: ~10%
On Teflsearch.com, the impact of misplaced on-page code was around 7.5%, vs Google Tag Manager. Keeping in mind that Google Tag Manager itself underreports, the total loss could easily be in the 10% range.
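One way to sanity-check that "10% range" claim: the two effects compound multiplicatively, since misplaced code loses traffic relative to GTM, which itself underreports relative to clean on-page code. A hedged sketch using the post's approximate figures (the 2% GTM figure is the post's ~1.7% rounded up):

```javascript
// GTM underreports vs. on-page code by roughly 2%, and misplaced on-page
// code lost ~7.5% vs. GTM on Teflsearch.com. The surviving fraction of
// traffic is the product of the two individual surviving fractions.
const gtmLoss = 0.02;
const misplacedVsGtm = 0.075;
const totalLoss = 1 - (1 - gtmLoss) * (1 - misplacedVsGtm);
console.log(`total loss ≈ ${(totalLoss * 100).toFixed(1)}%`); // just under 10%
```

With a more IE-heavy audience, where GTM loses closer to 5%, the same arithmetic pushes the total comfortably past 10%.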
Bonus round: Missing data from channels
I’ve focused above on areas where you might be missing data altogether. However, there are also lots of ways in which data can be misrepresented, or detail can be missing. I’ll cover these more briefly, but the main issues are dark traffic and attribution.
Dark traffic
Dark traffic is direct traffic that didn’t really come via direct — which is generally becoming more and more common. Typical causes are:
Untagged campaigns in email
Untagged campaigns in apps (especially Facebook, Twitter, etc.)
Misrepresented organic
Data sent from botched tracking implementations (which can also appear as self-referrals)
It’s also worth noting the trend towards genuinely direct traffic that would historically have been organic. For example, due to increasingly sophisticated browser autocompletes, cross-device history, and so on, people end up “typing” a URL that they’d have searched for historically.
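For the untagged-campaign cases above, the standard fix is to tag links with UTM parameters so the visits land in the right channel instead of direct. A minimal sketch using the WHATWG URL API, available in Node and modern browsers; the URL and parameter values are hypothetical examples, not from the post:

```javascript
// Build a campaign-tagged email link so GA attributes it to the campaign
// rather than bucketing the visit as direct. All values here are made up.
const link = new URL('https://www.example.com/newsletter-offer');
link.searchParams.set('utm_source', 'newsletter');
link.searchParams.set('utm_medium', 'email');
link.searchParams.set('utm_campaign', 'june-promo');
console.log(link.toString());
// https://www.example.com/newsletter-offer?utm_source=newsletter&utm_medium=email&utm_campaign=june-promo
```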
Attribution
I’ve written about this in more detail here, but in general, a session in Google Analytics (and any other platform) is a fairly arbitrary construct — you might think it’s obvious how a group of hits should be grouped into one or more sessions, but in fact, the process relies on a number of fairly questionable assumptions. In particular, it’s worth noting that Google Analytics generally attributes direct traffic (including dark traffic) to the previous non-direct source, if one exists.
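The last-non-direct rule can be illustrated with a toy sketch. This models only the behavior described above, not Google's actual implementation, and ignores GA's real campaign-timeout windows:

```javascript
// Toy model of "last non-direct click" attribution: each direct session
// inherits the most recent non-direct source, if one exists.
function attributeSessions(sources) {
  let lastNonDirect = null;
  return sources.map((source) => {
    if (source !== 'direct') lastNonDirect = source;
    return lastNonDirect ?? 'direct'; // truly direct only if nothing came before
  });
}

console.log(attributeSessions(['organic', 'direct', 'email', 'direct', 'direct']));
// → [ 'organic', 'organic', 'email', 'email', 'email' ]
```

The upshot for reporting: dark traffic doesn't just inflate direct, it also silently inflates whichever non-direct channel happened to come before it.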
Discussion
I was quite surprised by some of my own findings when researching this post, but I’m sure I didn’t get everything. Can you think of any other ways in which data can end up missing from analytics?
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2skU6gW via IFTTT
katerbees · 7 years
Cadre part 2
Hey guys! It’s been a few weeks, but my daughter and I were both sick for two of them -_- and I’ve been more in a reading mood than a writing one. I don’t like to force chapters because I want them to flow naturally, so here is a short update <3
link to pt 1--https://katerbees.tumblr.com/post/164437694835/cadre-pt-1
Short summary: Lorcan was in a metal band called Cadre; he and the other dudes are drug dealers for Maeve, and there are lots of other gangs throughout the city (Ilken, Valg, etc.). Elide runs into trouble during a show one night and he saves her.
Chapter 7
Sunlight streamed into the suite. Elide awoke with a start. She felt like she had slept for years. She quickly took stock of the events of the night before. She had escaped the Ilken again. But they would still be looking for her. She had to get her information to Celaena as soon as possible. Once she did that, she would need to meet up with Aelin who could get her into witness protection. 
She remembered Lorcan mentioning Maeve. Maeve was his boss. Aelin had dealings with Maeve, she remembered hearing through the grapevine. She would need to find a way to bring it up. 
Speaking of Lorcan, where was he? She listened; she heard the sound pollution of the city and nothing else. 
It was then she noticed the smell of cigarette smoke.
“Lorcan!?” she yelled.
Sudden coughing came in response.
“Are you smoking in the bathroom?” Elide yelled while rolling out of bed. Were they in high school? Elide walked over to the bathroom door and opened it. Lorcan was standing next to a window, his cigarette resting just barely outside the sill.
“This is a non-smoking room,” Elide chided.
“Seriously?” Lorcan replied.
Elide stamped her foot, her full lips pursing into a pout. Lorcan stared at her mouth. Was she really mad at him for breaking a dumb rule? She really was a good girl. Lorcan rolled his eyes and flicked his cigarette out the window.
“I needed to smoke. I didn’t want to leave you here by yourself and have you flip out and think I’d ditched you.”
“Oh,” Elide responded. She hadn’t thought about it that way.
“Well, thank you then.” Lorcan started to get up. “But smoking is still pretty gross.”
“Yeah, thanks, Mom, I know,” Lorcan replied.
This time, Elide rolled her eyes and turned to leave the bathroom. Lorcan admired the view. While her shorts did have fluffy unicorns on them, they barely covered her cheeks. He hadn’t picked them for that reason, but at this exact moment he was glad he had.
Elide plopped down on the bed. “So…” she began. 
“Yeah,” Lorcan said from the doorway. “What’s next?” he asked. “We can’t stay here forever. What’s your goal?”
“I have an item I need to deliver to a woman named Celaena. Once I deliver it to her, I need to find a woman named Aelin.”
Lorcan froze. It was an unusual name. “What do you need with this Aelin?” he asked tersely.
Elide was picking up on some hostility. Hostility that would imply that he knew Aelin.
“Do you know her?” Elide asked, a sudden edge to her voice.
“Maybe. What do you need her for?” Lorcan asked, poorly dodging the question.
“I can’t tell you. I’m sorry. I can tell you that once I do find her, you and I can split ways. She will be able to protect me from Vernon.”
Lorcan snorted. “We clearly aren’t talking about the same person. The blonde bitch I know isn’t exactly the protective sort.”
“What’s that supposed to mean?” Elide responded, surprised at how defensive she was becoming over someone she hadn’t met.
“That bitch tried to kill me and stole my best friend,” Lorcan snarled.
Elide raised her eyebrows. “Well, half of that seems like a legitimate concern, and the other half makes you sound like a child,” she retorted.
Lorcan was pissed. How dare she talk to him like that. She had no idea what he had been through, that he had even stuck his neck out for her last night, or how he and Whitethorn had been best friends before that woman showed up.
“Fine,” Lorcan gritted through his teeth. “Let’s go find her so you can see what a doll she is. She threatened to kill me the next time she saw me, so you can get a front-row seat to how great and protective she is!” Every word built toward a crescendo of shouting.
Elide just crossed her arms. She had seen tempers flare in her day. This man did not frighten her. She hadn’t survived this long by being wrong about these things. “Sounds like a plan.” She called his bluff.  
Chapter 8
They rented the room for another night on their way out into the city. They would walk around looking for any information on Aelin’s current whereabouts. 
They wandered into a crowded pub at lunch time. Elide headed straight to the back of the room, spying an empty table for two. She sat down in her seat. Lorcan landed in the seat across from her. 
“Stop looking so surly,” Elide commanded.
A server walked over, handed them menus, and took their drink order. Lorcan attempted to look more like an approachable person.
“Now, dearest husband,” Elide began, winking at Lorcan, “where did you say you last saw Ms. Aelin?” she asked, reaching over and holding Lorcan’s hand.
Lorcan froze. Why was she doing this to him? He wondered. Couldn’t he just take her to where he had last seen Aelin and been rid of her? It would have been so much easier. And yet, he knew he couldn’t. He let her hold his hand, he raised his head and met her gaze.
“Ah wife. Don’t you remember? It was down behind Mistward apartments. In the courtyard.” He responded, his voice dripping with sarcasm. “She and her boyfriend had a lovely chat with me about how much they would love to kick my ass.”
Elide kicked him under the table.
“I’m sorry, I meant play some brass. I’m only good at stringed instruments, you see.”
Elide tried hard not to laugh, but did end up giving him a hint of a smile. “As I was saying,” Elide continued, “It’s just so strange that we haven’t heard from our friend Aelin in so long. Especially since you two had trouble agreeing on the price of those mirrors the last time we hung out.” 
Lorcan suddenly became very aware of what this little trickster across from him was doing. He also felt the eyes of quite a few seedy individuals turn their way.
Their drinks arrived. Elide began to slowly sip her glass of wine; Lorcan gulped down his beer.
“Anyways,” Elide continued, “I hope Aelin gets that mess sorted out.” She lowered her voice in a fake whisper for added dramatic emphasis. “I’d hate for anyone to find out about the missing mirrors.”
Lorcan played along. “Shhhh!” He gave her a concerned look and turned around to see who might be listening. To his admiration and surprise, several people were turning their heads ever so slightly to pick up their conversation.
“Check, please,” Elide said sweetly to the server the next time she came over. Lorcan started to say something to her.
“Hush husband, we have visitors.” Elide said with a wicked smile on her face that he had yet to see before. 
kadobeclothing · 4 years
8 Social Media Platforms That Weren’t Founded in the U.S.
In 2018, North America discovered a viral new app called TikTok. Within just a year and a half of its launch, it reportedly grew to over 800 million active users.
As tech journalists and bloggers dug into the origin of the odd video app, some were surprised to discover that TikTok, which took America by storm, was actually inspired by an app created by the same company in China.
Researching social media platforms not founded in the U.S. can be helpful when trying to learn about audiences that live in other countries, especially those that live in highly censored areas like China or Russia. In areas like these, where people might not be able to access common U.S.-based networks like Facebook or YouTube, tech companies have built their own social media empires. Whether you’re interested in international marketing, or just want to learn more about how audiences around the world interact with the internet, learning about the top global platforms that weren’t created in Silicon Valley can be an eye-opening experience.

Below, I’ll walk you through eight of the most popular social media platforms around the globe, including TikTok’s origin platform, Douyin.

8 of the Biggest Global Social Media Platforms

QZone and QQ

Owner: Tencent
Origin: China (available globally, Chinese language only)
Name’s Origin: QZone and QQ were shortened from the original name, OICQ. The O stood for “open,” while ICQ is an instant-messaging term that sounds like “I seek you” when said aloud.
Reported Users: 517 million active users on QZone; 653 million monthly users on QQ

QZone is a social media channel, while QQ is a messaging app that links to a QZone account. The two apps serve as an alternative to Facebook in countries like China and South Korea, where the U.S. platform is blocked.

The overall platform’s story began in 1999, when QQ launched as a desktop messaging site. QZone, a social media site and app, launched in 2005. In 2019, as mobile-first mindsets grew among millennial and Gen Z audiences, Tencent transformed the QQ website into a standalone app. QZone still serves as a social media platform, while QQ is now similar to the Facebook Messenger app.

To help you better visualize how people have used QQ and QZone, here’s a quick analogy: when U.S. millennials like me were children, we raced home from school to message our friends on AIM.
Then, as we reached our high school years, we ventured onto Facebook, where we could message people, create a profile, and post updates.

Meanwhile, in China, people in my age group might have messaged friends on QQ’s messaging website instead of AIM. Now, QQ users might use its sister app, QZone, for a social media experience that’s comparable to Facebook. Then, to message friends, they use QQ.

Here’s an example of a current QZone profile that shows just how similar the platform is to Facebook:
On QZone, users are encouraged to publish posts, videos, or even music. Like Facebook, they can also connect with friends, see a feed of updates, comment on, share, or react to posts, and update cover or profile photos.

QQ and QZone are great examples of a social media brand that gained traction long before we signed on to well-known platforms owned by U.S. tech firms. The usership of QQ and QZone might be so strong in China because young audiences discovered the brand for its messaging tool and were able to join and enjoy its full social media network later on. Although Facebook launched in 2004, one year before QZone came out, Tencent had already captured the millennial demographic with QQ, figured out how to grow its product competitively with QZone, and continued to add features to pull in Gen Z audiences.

While QZone and QQ seem like Facebook alternatives, their history of growth is fairly parallel. And, as one of the ten biggest social media platforms globally, QZone is worth knowing about if you plan to market to Asian territories with heavy censorship.

WeChat/Weixin

Owner: Tencent
Origin: China (available globally)
Name’s Origin: The app’s original name was Weixin, which translates to “micro letters.” In a play to become a more globally used app, Weixin rebranded to WeChat in 2012.
Reported Users: 1 billion active users

WeChat launched in 2011, before competitors like Kik and WhatsApp. The messaging app was rolled out by Tencent, which also created QZone and QQ. Similarly to WhatsApp, WeChat allows free text-based chats and voice calls. It also has a Moments feature similar to Instagram or Facebook Stories. Like many U.S. platforms, you can place fun animated stickers and filters on images or videos sent within the app.

To join WeChat, users need to know someone on the platform and have them scan an activation code that the new user receives when logging in for the first time.
Once users are logged in, they can access WeChat’s messaging feature, send video messages to friends, and host virtual phone calls with one or more contacts.
Aside from using the platform for friend-to-friend communication, users can also take advantage of more entertaining features. For example, while one feature allows people to play in-app games live against friends, others include video or photo filters that can be used in chats or video calls.

Here’s a quick video that shows a popular game being played on the WeChat app:
[embedded YouTube video]
WeChat also connects to a number of third-party or Tencent-owned apps so that a user can take on multiple tasks, like hailing a ride or paying friends money, directly from WeChat.
At this point, Facebook, Snapchat, WhatsApp, and other major social media apps are using a similar strategy of pulling additional features into their platforms to keep users logged in longer. For example, many of today’s top messengers allow group video calls, offer in-app games, or let users send money.

Weibo (Also Known as Sina Weibo)

Owner: Sina Corp.
Origin: China (available globally, with text in Chinese)
Name’s Origin: Loosely translates to “microblogging.”
Reported Users: Projected to hit 500 million monthly active users in 2020

Weibo, also called Sina Weibo, is the biggest Chinese microblogging and social media platform. The social channel made news in 2017 when it reached more monthly active users than Twitter, and the app has continued to grow since. Reports suggest that more than 30% of Chinese internet users now have an account on the platform.

When you visit the website, it’s automatically in Chinese; however, you can use Google Chrome’s translator to convert the text to different languages. To give you a better idea of how Weibo works, here’s what its desktop site looks like with the English translation turned on.
Like Twitter, Weibo offers a central feed where you can see the latest or highest-performing posts. To toggle through different types of posts, you can click a category on the list in the left sidebar. On the left, you can log in to an account and see “Hot Topics,” similar to Twitter’s “Trending Topics.” If you continue to scroll down, you’ll also see a box that allows you to search for people, similar to Twitter’s “Who to follow” block.

To help you see just how similar these two platforms are, here’s a picture of Twitter’s feed.
While Weibo's layout and microblogging mission are similar to Twitter's, the Chinese platform emphasizes videos, photos, and trending content, while Twitter still puts an emphasis on the accounts sharing the content. Although Twitter also prioritizes trending content, you can easily see which of your followers posted, liked, or retweeted something before seeing the content of the tweet itself. This encourages audiences to focus on the person who tweeted as well as the tweet. These slightly different layouts might also suggest that the audiences of each platform prefer to consume content differently: while Twitter's audience might prefer the human-connection aspect of its site, those on Weibo might want to jump straight to the content.

Douyin
Owner: ByteDance
Country of Origin: China (available only in China; TikTok is available globally)
Name's Origin: Douyin loosely translates to "shaking sound" or "vibrato."
Reported Number of Users: 400 million active users

Douyin is the Chinese-only counterpart to TikTok, which is owned by the same company; TikTok is specifically a merger of Douyin and the Musical.ly lip-syncing app. Both apps specialize in letting users create, edit, and share short-form vertical videos of less than 60 seconds. After Douyin launched in early 2018, it grew to 150 million active users in its first year. ByteDance then purchased a similar lip-syncing app called Musical.ly and created TikTok, a global app that merged the best features of each original app. Since launching in late 2018, TikTok has also seen viral growth, gaining roughly 800 million active users. Like Douyin, TikTok also attracts a large number of international influencers from countries like India. While Douyin and TikTok are very similar short-video apps and are often confused as the same thing, they have slightly different feature sets, are powered by different servers, and face different levels of censorship due to regulations around the world.
Douyin isn't accessible for us to preview in the U.S., but news outlets describe it as even more advanced than TikTok, particularly when it comes to ecommerce. For example, while TikTok only recently rolled out advertising options, Douyin reportedly allows users to triple-tap a video featuring a product to go to the brand's ecommerce store. Users can also take virtual tours of hotels, stores, or travel locations in the app, and then visit the websites affiliated with those tours. Unlike TikTok, Douyin also reportedly lets users geotag themselves at a store or location, which could help local brands build awareness. As for what Douyin and TikTok have in common, both are apps where users share short videos that can feature musical overlays, fun filters, and text overlays, and videos on each platform must be 60 seconds or less. From photos I've dug up online, the two platforms' designs look very similar, with some slight differences. The screenshots below show a video ad and the platform's search feature:
As a comparison, here's what a video ad and the search feature look like on TikTok:
Both apps are also centered around a video feed that algorithmically shows users videos from followers or videos that they might enjoy based on previous content they’ve viewed on the app. When users enter the app and see the feed, one video will begin playing automatically. To see other videos, users simply swipe up on that feed. Here’s what this looks like on TikTok:
Aside from these shared features, both fast-paced social media channels are favored by younger generations, especially Gen Z. While over 50% of TikTok's audience is younger than 34, Gen Z has reportedly flocked to Douyin as well. Douyin and TikTok's success story is a great example of how one unique international platform can change the way we consume content in other countries. Prior to TikTok's launch in the Western hemisphere, many Americans hadn't seen anything similar since Vine, a short-video app and feed that Twitter discontinued in 2016. Not only did Douyin and TikTok replace the need for Vine, but both apps also catered to Gen Z and millennials, two mobile-first generations known for short online attention spans and heavy consumption of video content.

Kuaishou
Owner: Kuaishou Technologies
Country of Origin: China
Name Origin: No translation could be found.
Number of Users: 400 million monthly active users

Kuaishou is a short-video app that competes with Douyin. The platform started as a GIF-sharing site called Kuaishou GIF; like GIPHY, it let anyone create, post, and share GIFs on the network. In 2012, Kuaishou dropped GIF from its name and pivoted to a short-video platform. Its interface is now very similar to Instagram's: it doesn't have a Stories feature, but it has similar-looking profile formats and feeds where users share short vertical videos rather than photos or other content.
Like Douyin and TikTok, Kuaishou offers fairly similar video features in that users can leverage sound bites and basic editing tools in their content. Like many social media platforms with video, Kuaishou also allows users to overlay text and stickers on images or videos.
From looking at a variety of videos uploaded from this platform to other sites, it seems that many Chinese residents use the app to highlight their unique skills and musical talent. Here's a video that highlights multiple short videos from the platform: https://www.youtube.com/watch?v=r0JCLvFVfDU A number of users have also shared videos that highlight and celebrate aspects of rural China. Here's an example that Kuaishou posted on its own YouTube page:
[embedded YouTube video]
In 2019, Apple recognized Kuaishou as a global app defining mobile video storytelling trends. This came shortly after the app notably added ecommerce and monetization features that enabled creators in poverty-stricken rural China to earn money from it. Aside from its short videos, Kuaishou also allows users to live-stream content to their friends or followers on the platform. As with the short-video feature, streaming has allowed users to film longer videos that highlight their skills.

VKontakte (VK)
Owner: Mail.Ru Group
Country of Origin: Russia
Name's English Translation: VKontakte loosely translates to "in contact with."
Reported Number of Users: 500 million users

VK, the biggest social channel in Russia, is like a mix between Facebook, YouTube, and the illegal '90s downloading service LimeWire. The platform allows users to publish and share text-based posts, photos, video files, and music files with their connections. Its user interface looks eerily similar to Facebook's, with some slight tweaks to page layouts. Here's what VK's main feed looks like:
And here's an example of what a profile page looks like:
Aside from sharing their own content, users can allegedly upload and stream copyrighted material such as music and movies. This has reportedly been widespread on the platform since its launch, but legal action has not been taken against it. The social platform is owned by Mail.Ru Group, a tech company that owns Russia's main search engines and OK.RU, a social platform covered below. Reports also suggest that this company has affiliations with the Russian government. At the moment, some experts suggest that while liberal-minded internet users spend time on Facebook, Russia's more conservative users spend time on Mail.Ru-owned sites. In a recent Vice article, one user from Moscow described how Facebook is seen as a liberal network. "It's become more 'woke' to be on Facebook because most Russian liberal intellectuals use it," the user explained. "It's kind of like Twitter for us, it's where you follow people when you want to see what they think." It's interesting to consider that while U.S. internet users of all political backgrounds tend to argue their points on Facebook, people in countries like Russia might stick to one platform or another based on political affiliation.

Odnoklassniki (shorthand name: OK.RU)
Owner: Mail.Ru Group
Country of Origin: Russia
Name's English Translation: Odnoklassniki translates to "classmate."
Reported Number of Users: 200 million users

Odnoklassniki, or OK.RU, is a website that specializes in connecting people with their past classmates. The platform is set up like a blog, similar to Tumblr, where users share life updates, images, or videos, which then show up in other users' home page feeds. Along with their classmates' content, a sidebar shows users the most popular videos posted on the platform. Below is a look at the feed. While the site is in Russian, I've used Google Chrome to translate it so you can get a better idea of what people are posting.
Like Tumblr, users can also click on the name or image of another user in a post to find that person’s profile. These profiles are similar to Facebook’s layout in that they show a cover photo, a user’s top followers (called “participants” in the image below), and then a feed of their own content.
While this platform is similar to both Tumblr and Facebook, Russian citizens don't use those two platforms as commonly as OK.RU. Additionally, with warnings that Facebook, YouTube, and Instagram could be banned in Russia, it's not shocking that tech companies in this region have innovated to create similar alternative platforms. As we've seen with the Chinese platforms on this list, many of the top global social platforms were inspired by, or created as alternatives to, U.S.-origin sites that were censored or banned. These region-specific social media industries could be a theme we continue to see in different areas of the world. If you're an international marketer, it can be helpful to identify segments of your audience that might not be able to access the platforms you use for social media marketing. As you do this, you should also research any alternatives you can access to promote your content or product in those audiences' locations.

The Similarities of Global Social Platforms

Despite their critical differences, social media industries around the world have some fascinating similarities to that of the U.S. For example, while many of North America's most prominent social media apps are owned by Facebook or were created in Silicon Valley, most prominent global platforms were created by Tencent or other major tech firms based in China. Similarly, the two biggest Russian platforms are owned by the same company. Although many of the apps above were created and launched in highly censored areas, they don't seem to be falling behind on innovation because of these restrictions. Because international social platforms have evolved on a similar timeline to many common U.S. platforms, it seems these regions are building parallel social media industries of their own, rather than simply racing to create alternatives to ours.
One thing that does seem different about internationally launched social media platforms is that many of the examples on this list emphasize direct communication more than content, while our own platforms emphasize content and user experience. While every social media platform we use today has a messenger with various features, only a few of them are, or ever were, standalone messengers. For example, QZone and WeChat built their initial audiences as messenger apps and broadened their features from there. Meanwhile, Snapchat and WhatsApp are the only top U.S. platforms that started as direct messengers; although Facebook Messenger is one of the most prominent text-messaging platforms, it is an expansion of Facebook's original News Feed-centered platform. If you're interested in reaching international audiences, pay attention to the social media platforms they most commonly use and what they use them for. This will give you insight into the types of content these groups engage with, what motivates them, and which strategies could influence buying decisions. If you want to tap into an international audience but can't access some of the platforms on this list, consider testing a campaign on a globally accessible platform like WhatsApp or WeChat.
0 notes
king-shrug · 6 years
Text
7 Videos About SEO You Should Watch
Whenever I mention SEO, people always ask what SEO actually is. Our content team makes certain that everything we publish is high-quality and SEO-friendly. Pro tip: seasoned content writers and SEO experts can help you update your old blog posts based on current trends. DA (Domain Authority) is SEO firm Moz's ranking of how authoritative a website is, based on its link profile and other factors (i.e., the quantity of backlinks pointing to a site from other sites). Leveraging voice search is another of the top SEO trends influencing search engine optimization in 2018. You should define your NAP (name, address, phone number) data for your business at the start of your local SEO campaign and keep it consistent. Plenty of SEO experts have observed that merely building a large number of links will not help a website rank in the long term. SEO matters because it helps you drive quality traffic, gain visibility, boost your brand, and lend your business the credibility it needs to succeed. One of the biggest trends that has already begun and will continue well into 2018 is the consolidation of niche MarTech players by larger content cloud vendors, with the role and importance of SEO increasing significantly throughout this transformation. The key to successful SEO is focusing on long-tail keywords: you want to be at the top of those search results, because even if you have thousands of social media followers, that still won't be as effective as ranking high on the SERPs for your target keywords.
While getting as many pages indexed as possible was historically a priority for an SEO, Google is now rating the quality of the pages on your site and the type of pages it is indexing. Around 95% of people never move beyond the first page of search results, so as a business you need an in-depth knowledge of the latest SEO techniques and should start incorporating them today. SEO (search engine optimization) is probably the most significant competitive digital marketing advantage you can have over your competition; for marketers who were brought up in the traditional SEO market, 2018 is a time to adapt or die. Today, I'll discuss the on-page SEO techniques that will help you improve your position in the 2018 search results. Long descriptions are great for YouTube video SEO and give you a place to add timestamps, useful links, and an overall summary. SEO is the process of maximizing a site's search engine visibility to connect with potential users and customers throughout their search journey. Two tools that help with local SEO are BrightLocal (for rankings) and Moz Local (for local search optimization). SEO professionals expect that Google will get even more specific in analyzing relevant content. Crawl your site, as Google does, with (for example) the Screaming Frog SEO Spider, and fix malformed links and anything that results in server errors (500), broken links (400+), and unnecessary redirects (300+). SEO is mainly about keywords.
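The crawl-and-fix step above can be sketched in code. This is a hypothetical illustration, not part of any real crawler's API: it simply buckets HTTP status codes the way a crawl report does, so server errors (5xx), broken links (4xx), and redirects (3xx) can be counted and fixed. The function name and the sample URL list are invented.

```python
def classify_status(code: int) -> str:
    """Map an HTTP status code to the crawl-report bucket it belongs in."""
    if 500 <= code <= 599:
        return "server error"   # fix the page or the server configuration
    if 400 <= code <= 499:
        return "broken link"    # repair or remove the malformed link
    if 300 <= code <= 399:
        return "redirect"       # collapse unnecessary redirect chains
    if 200 <= code <= 299:
        return "ok"
    return "other"

# Example: summarize a list of crawled (url, status) pairs.
crawl = [("/", 200), ("/old-page", 301), ("/missing", 404), ("/api", 500)]
issues = {url: classify_status(code) for url, code in crawl if code >= 300}
print(issues)  # {'/old-page': 'redirect', '/missing': 'broken link', '/api': 'server error'}
```

A real crawl would fetch each URL and record the returned status; the classification logic stays the same.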
See how I naturally included multiple variations of my keywords (called LSI keywords), such as "how to do seo," "seo tips and techniques," and "search engine optimization wordpress." Whether you're a marketer, a business owner, or an SEO agency, it's always good to research and learn new SEO methods. With that in mind, here are several key tips for improving your Google rankings, based partly on the latest SEMrush SEO study and partly on our own analysis of marketing trends. In SEO terms, that means many more long-tail keywords on your list. Taking the effort to understand even the basics of SEO can help your site earn higher click-through rates, more engagement, and, of course, better rankings. This will lead to a more personalized experience, while the rise of voice search and digital assistants offers ideal ground to develop artificial intelligence and reward SEO strategies that keep up with the trends. This new paradigm of users relying on voice search for many of their needs will be a game changer for SEO. Internet users are eager to know how new SEO trends could affect their business this year; it's our job to stay ahead of digital marketing trends, so check out these nine SEO predictions that will shape your digital marketing strategies. Manipulating your content so that Google and other search engines naturally return your page near the top of the organic results is what SEO is all about. As Rand Fishkin pointed out in a Whiteboard Friday, content that is optimized for keywords still holds valuable SEO power. Some SEO professionals and bloggers say that short URLs rank better in Google.
I didn't know much about YouTube SEO before, but this shows how to rank a video on both YouTube and Google. The rise of featured snippets, PPC, voice search, and local SEO can often yield better results than an organic ranking alone. SEO in 2018 is expected to include some remarkable new features, and the current trends favor those expectations. Content that is clearly designed for SEO purposes alone is something Google is already devaluing; dropping obvious keywords into a piece of content as many times as possible, without focusing on the quality and relevance of that content, will get a site nowhere (apart from down!). So, rather than cramming a keyword into your content as often as you can, white-hat SEO helps you use keywords in a way that makes sense and engages visitors. In 2018, smart SEOs and data-driven companies will focus more on searcher intent and less on keyword volume, difficulty, and link metrics. During this phase we also perform tasks like keyword analysis (for existing pages), on-page SEO corrections, schema markup, and more. As far as I know, this only works for HTML and CSS pages; I can't say much for Flash websites, and I am not sure how they pan out with respect to search engines and SEO. SEO is targeted at organic search results: getting to the top of the list when a consumer searches for keywords related to your business. Without a local SEO program in place, your business won't be able to take advantage of local online demand for your products or services.
If you want higher rankings, you need to read his stuff: he's the unicorn in a sea of donkey SEOs. Every business has a limited budget for organic marketing, and the SEO professional's aim is to make the most of what's allotted. Consequently, link building is one of the most important items on the list of 2018 SEO trends. A good user experience, from an SEO viewpoint, is about more than your website's speed; for instance, several businesses miss the mark with SEO and images and nevertheless rank well. User experience as such has an enormous influence on SEO. Off-page SEO promotes your business very efficiently through social media, bookmarking sites, forums, blog directories, Q&A sites, articles, videos, image and infographic sharing, and document sharing. So how do the SEO masters cope with the ever-changing behavior of the search engines? For instance, suppose we want to write a blog post about SEO because we are trying to build an inbound link to an SEO website. The weight given to links will probably keep ticking up as SEOs become more advanced in their strategies and Google continues to place so much emphasis on them. In today's rapidly moving world, SEO techniques can change on a dime, and the worst part is that you might not even know it: hacks that could have earned you a front-page result as recently as 2016 are not only outdated now, they may actively hurt your website's rankings.
Take some time to learn about your meta titles, descriptions, and URL readability, and how to earn featured snippets and sitelinks. SEOs around the globe are taking click-through rate seriously, claiming it to be one of the best ranking signals. Bing has confirmed that it tracks unlinked brand mentions and uses them as a ranking signal, and a patent by Google (along with observations from many SEO experts) indicates that Google may be doing this as well. I've seen success with producing videos lately and am looking forward to using these SEO guidelines to help my videos rank on page one. SEO professionals are thrilled about featured snippets because they give low-ranking pages a chance to reach the top of search results with almost zero extra effort. The goal of a higher ranking in the SERPs is to increase the quality and quantity of visits to the site by creating content and resources relevant to the target audiences searching for specific information; SEO helps businesses get discovered online. In this guide I'll show you how to get the most SEO value out of the Google Keyword Planner. As a CMO, you probably cringe when you hear the phrase "SEO" (search engine optimization), because let's face it: it's not the most pleasant marketing tactic to plan, strategize, and execute. First, understand that schema markup is one of the most powerful, least used components of SEO today. Schema are basically brief snippets of structured data that can give extra details to search users and search engines. We already know that user experience will become more and more important for SEO in 2018, since UX is a major ranking factor.
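To make the schema markup idea concrete, here is a minimal sketch of a JSON-LD structured-data snippet for a local business, built as a Python dict and serialized for embedding in a page's script tag. The business details are invented for illustration; Schema.org defines many more optional properties than shown here.

```python
import json

# A minimal LocalBusiness JSON-LD snippet (all business details are made up).
snippet = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee Shop",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
    },
    "telephone": "+1-555-0100",
}

# Serialize for embedding inside <script type="application/ld+json">...</script>.
print(json.dumps(snippet, indent=2))
```

Keeping the NAP details in this snippet identical to those on your listings is part of the consistency the article recommends.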
Along with increasing your conversion rate, reviews can also increase the effectiveness of your SEO efforts, because user-generated content is a consistent source of unique, fresh content that you didn't have to pay an employee or contractor to write. Posts on this blog are readable and informative, sharing news about Google, SEO trends, and more. With new technologies and techniques, algorithm updates, and shifting trends, SEO is forever evolving and offering new opportunities for marketers to rank highly in search results. I suspect that technical SEO mistakes that waste crawl budget, and that clutter Google's index with non-SEO-friendly content such as social landing pages, WordPress media archives, offer pages, and cloned ecommerce product pages, will have an even more harmful effect on sites going forward. Experience that spans multiple channels with an integrated mindset, especially SEO and PPC synergy combined with a performance mentality, sets newcomers to the industry up for success. Voice search is also an emerging trend in SEO strategies. Despite the fact that SEO delivers the highest ROI of any ecommerce marketing campaign, most online shops are put together with little to no consideration of search engines. Consider these points while developing an SEO strategy to earn higher rankings in search results. The term SEO is also sometimes used to refer to search engine optimizers: consultants who manage and support the development and completion of search engine optimization projects for clients.
The faster you understand why Google is sending you less traffic than it did last year, the sooner you can clean up and focus on proactive SEO that starts to impact your rankings in a positive way. Keyword rankings are another very important metric to track when you analyze your SEO efforts. Voice search is one of the latest SEO trends in 2018. My definition of SEO is the focus on strategies that lead to placement on the search engine results pages (often referred to as SERPs) when a user performs a search (query). I agree with the point about using LSI keywords and long-form articles as part of an SEO strategy. Thinking of local search only as Google My Business optimization limits the opportunities businesses (especially local ones) have to earn SEO presence and traffic. Optimizing your SEO strategy, including site speed and content, to reach your mobile customers will become more important than ever in 2018. Other SEO ranking factors include accessible URLs, domain age (older is generally better), page speed, mobile friendliness, business information, and technical SEO. Regarding on-page SEO best practices, I usually link out to other quality, relevant pages on other websites where possible and where a human would find it valuable. To make things easier for you, we compiled a list of the 25 best SEO tools and grouped them into five neat categories: keyword research tools, technical SEO, backlink analysis, link building, and rank tracking. We looked at hundreds of SEO companies to decide which ones would make our rankings list. Thus, it's time for mobile optimization for better SEO results.
SEO means helping businesses do everything they can to make more money from the people visiting their website through organic search. Google has been actively encouraging businesses to convert their websites to HTTPS, and those websites are being rewarded with a minor SEO boost. Please bear in mind that all of this is my opinion on SEO, but it is the opinion of someone who is getting results with search engine optimization; to prove it, you can run a few Google searches and see that Infobunny gets results against some very strong, authoritative competition. All of these strategies wrapped together are known as SEO, or search engine optimization. At the most basic level, SEOs need to think about the fact that Google can now closely imitate what a human user does. Search engine optimization has to evolve with the times, but not in the way most people describe it; every time I see someone write "SEO has changed a lot in 20 years," I laugh out loud. Keyword research is one of the most important aspects of SEO. It's that time of year again, when we see an influx of SEO articles forecasting the trends to look for in the new year. I go into much more detail in SEO Page Titles: 15-Point Checklist for B2B and B2C Brands, which explains the best ways to work in relevant keywords that accurately reflect the page content. The main thing, however, is that local SEO is almost always cheaper and more efficient than traditional marketing. Search engine optimization, or SEO for short, is the process of optimizing websites so that they rank highly in the search engines.
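The HTTPS point above lends itself to a quick automated check. This is a hedged sketch using only the standard library: it flags URLs whose scheme is not `https`, which is the kind of audit you might run over a sitemap before a migration. The URLs are purely illustrative.

```python
from urllib.parse import urlparse

def is_https(url: str) -> bool:
    """Return True if the URL is served over HTTPS."""
    return urlparse(url).scheme == "https"

# Flag any pages still served over plain HTTP.
urls = ["https://example.com/", "http://example.com/blog"]
insecure = [u for u in urls if not is_https(u)]
print(insecure)  # ['http://example.com/blog']
```

In practice you would feed this a full list of site URLs (e.g. parsed from a sitemap) rather than a hand-written list.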
SEOs will start to optimize content for precisely defined user locations, such as near one of their shops or within a cafe district, to ensure they capture every potential qualified lead. But with the latest changes in the SEO industry, earning the coveted #1 spot may not be enough to get you all the traffic you want for your business. While it's impossible to keep pace with every update in your efforts to boost your page ranking, there are a few key SEO trends and strategies to follow in 2018. SEOs conveniently call this effect "domain authority," and it seems to be related to PageRank, the system Google started using to rank the web in 1998. Ranking for "seo" is hard. An accessible URL is an important SEO ranking factor, and keyword research is an important part of SEO. Google's CEO Sundar Pichai announced that voice searches make up to 20% of mobile queries, which puts new challenges in front of SEO experts. SEO stands for "search engine optimization": the process of getting visitors from the free, organic, editorial, or natural results on search engines. This typically involves using advanced SEO tags that search engines reference when the more visual search results page is displayed. Priorities for SEO in 2018 features suggestions on what to prioritize, including on-site search, topical, local, and mobile SEO, and where SEO capabilities ought to sit within organizations. We might write a blog post containing the relevant SEO keywords and place the link to the website we are trying to build a backlink to within the text of that post. My SEO guide will break search engine optimization down for you.
A meta description is not a direct SEO ranking factor, but it helps with rankings indirectly. Once such an interview gets published, it nearly always attracts a lot of backlinks and SEO value. SEOs more often discuss domain trust and domain authority based on the number, type, and quality of incoming links to a site. As digital assistants get more accurate, there is a great opportunity for both SEO and content to benefit from a growing market that connects a brand with a consumer in a unique but still relevant and useful way. It's not just search engine algorithms that keep changing, but also the way people search; so, by considering the ten SEO trends above, digital marketers and business owners can make their sites ready to rank highly in 2018. If you look at huge brands like Amazon and Walmart, you see how powerful SEO can be: when you search for just about any consumer product on Google, Amazon and Walmart are more than likely to appear on top. In a similar vein, you can't visit an SEO blog at the moment without seeing a feature on voice search. As a matter of fact, the power of any website lies in its DA: Domain Authority is Moz's computed metric for how well a given domain is likely to rank in search results. SEO training courses will make you well-versed in various SEO strategies like link building, keyword research, optimizing content with the right keywords, optimizing site structure, off-site SEO, PPC marketing, and more. Therefore, in addition to adapting the SEO technique of writing content based on keywords, it will be important for businesses to create a responsive website for both mobile devices and computers.
SEO” stands regarding Search Engine Optimization - SEARCH ENGINE OPTIMIZATION is the process of obtaining your website to rank increased in search engines. The best thing a person can do to assist your own local SEO ranking is in order to be sure that all existing listings possess the correct Name, Address and Phone Number (NAP) and web site URL.
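The NAP-consistency check described above can be automated. Here is a minimal sketch (the listing data and normalization rules are invented for illustration, not tied to any particular directory's API) that normalizes Name/Address/Phone triples and flags listings that disagree with the first one:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a Name/Address/Phone triple for comparison."""
    norm = lambda s: re.sub(r"\s+", " ", s.strip().lower())
    digits = re.sub(r"\D", "", phone)  # "(555) 123-4567" -> "5551234567"
    return (norm(name), norm(address), digits)

def inconsistent_listings(listings):
    """Return indices of listings whose normalized NAP differs from the first one."""
    reference = normalize_nap(*listings[0])
    return [i for i, l in enumerate(listings) if normalize_nap(*l) != reference]

listings = [
    ("Acme Bakery", "12 Main St", "(555) 123-4567"),
    ("ACME Bakery", "12 Main St.", "555-123-4567"),  # trailing dot in address -> flagged
    ("Acme  Bakery", "12 Main St", "5551234567"),    # consistent after normalization
]
print(inconsistent_listings(listings))  # → [1]
```

A real audit would pull listings from each directory and likely need fuzzier address matching ("St" vs "Street"), but the principle — normalize, then compare — is the same.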
0 notes
whatevernevermind · 6 years
Text
7 Videos About SEO You Should Watch
SEO or perhaps known since Search Engine Optimization. Because when somebody searches for SEO via tone of voice search, he would speak ‘what is SEO' or ‘How in order to do SEO' rather than just requesting SEO. The search engine informs 500-600 updates annually and the particular SEOs need to comply along with these changes to ensure that will their websites follow legitimate marketing practices. They are considered a crucial Local SEO search engine ranking factor. SEO within 2018 means greeting the consumer with the exact information these people are looking for at the particular exact time they are searching for it. This could be on the variety of platforms, from tone of voice search to local businesses plus online shopping. Google algorithms for SEO are usually quite complex and are getting updated constantly to keep company quick-witted and responsive. At the same period, the rise of messaging applications and personal assistants has introduced new challenges which SEO should continue to overcome in 2018. On the particular other hand, traffic coming through SEO is free and organic” and can take a lengthy time to obtain recognized, whilst the traffic generated from PAY-PER-CLICK (Pay Per Click) and SEARCH ENGINE MARKETING (Search Engine Marketing) is just not free of charge and companies pay for to show up on the top of the particular Search Engine to get fast not organic results. So, I might declare 2018 is a problem for Google, just as very much as it might be intended for SEOs. We have several good rankings, but we are usually always trying to improve and observe what the future holds with regard to SEO. In 2018, the user experience is the particular center of development as nicely as the key point regarding the SEO strategies. SEO continues to be king of the Internet, plus businesses will continue to require valuable, and unique content that will will meet the need associated with their company in the yrs ahead. 
Backlinks are one of the most significant SEO factors. In order to give you a head begin, here are the SEO styles and techniques we expect in order to dominate in 2018. Debbie A. Everson is the particular CEO of, experienced SEO Specialists and Search Engine Optimization Company to 2, 000 small companies. Off-Page SEO refers in order to all the things that may be done directly OFF your own website to help you much better search engine positions, such since social networking, article submission, community forum & blog marketing, etc. SEO (or Search Engine Optimization ) is probably the most significant competitive digital marketing advantage that you could have over your competition. For internet marketers who were brought up inside the ‘traditional SEO market, ' 2018 is a time in order to adapt or die. Today, I'll discuss the on-page SEO techniques which will assist you to increase your position searching results in 2018. Long descriptions are usually great for Youtube video SEARCH ENGINE OPTIMIZATION and give you a location to add timestamps, useful hyperlinks, and an overall summary. SEO will be the process associated with maximizing a site's search motors visibility to connect with the potential users and customers throughout their search journey. 2 tools to help with nearby SEO are BrightLocal (for rankings) and MozLocal (for local research optimization). SEARCH ENGINE OPTIMIZATION professionals have highlighted the likelihood that Google will get much more specific in analyzing relevant articles. CRAWL it, such as Google does, with (for example) Screaming Frog SEO spider, and repair malformed links or things that will result in server errors (500), broken hyperlinks (400+) and unnecessary redirects (300+). SEO is mainly about keywords. 
See how I naturally included multiple variations of the keywords (called LSI keywords) that I am looking to target how to do seo”, seo tips and techniques”, search engine optimization wordpress”, and so forth Whether the marketer or a business proprietor, and even an agency for SEARCH ENGINE OPTIMIZATION, it's always good for analysis and learning new SEO methods. With that will in mind, here are a few key tips for enhancing your own Google rankings, based partially around the latest SEMrush SEO study, plus partly on our own evaluation of marketing trends. Within SEO terms of speech, that will means much more long-tail key phrases included in the list. Taking the effort to understand even the basics of SEARCH ENGINE OPTIMIZATION can help your site get higher click-through rates, engagement, plus of course, rankings. This particular will lead to a far more individualized experience, while the rise associated with voice search and digital co-workers can offer the ideal floor to develop artificial intelligence plus reward successful SEO strategies that will keep up with the developments. This new paradigm associated with users relying on voice lookup for many of their lookup needs will be a video game changer for SEO. The internet users are most excited to know how fresh SEO trends could affect their particular business this year. It's the job to stay ahead associated with digital marketing trends so check out out these 9 SEO forecasts. The process associated with manipulating your content so Search engines and other search engines normally return your page near the particular top of the organic research is what SEO is most about. As Rand Fishkin pointed out in a Whiteboard Friday, content that is enhanced for keywords still holds important SEO power. 9 forecasted SEARCH ENGINE OPTIMIZATION trends that will shape your own digital marketing strategies. Some SEO professionals and bloggers say that brief URLs ranks better in Search engines. 
Amazing article i don't know regarding youtube SEO but after reading through this article i fully realize how to rank video on the internet and google also. The particular rise of featured snippets, PAY PER CLICK, voice search and local SEARCH ENGINE OPTIMIZATION can often yield better results compared to an organic ranking. SEO in 2018 is also expected to include some wonderful features with new systems and the trends are currently favoring the popular expectations. Content that will is clearly designed for SEARCH ENGINE OPTIMIZATION purposes is something that Search engines is already reducing - termination is the only way this can go. Dropping obvious key phrases into a piece of content material as many times as you can, whilst not focusing on the high quality and relevance of the articles will get a site no place (apart from down! ). So, rather than packing plus cracking a keyword into your own content as much as a person can, white hat SEARCH ENGINE OPTIMIZATION will help you with the particular usage in such a method how the content makes sense plus engages visitors well. I can clearly state that In 2018, Smart SEOs and even data-driven company can focus more on Searcher intention and less on keyword quantity, difficulty and links metrics. During this phase we furthermore perform tasks like keyword analysis (for existing pages), on page” SEO corrections, schema markups plus more. As far since I know, this only functions for HTML or CSS web pages - I don't go very much for Flash websites, and was am not sure how that will pans out with respect in order to search engines and SEO. SEO will be targeted on organic search outcomes — progressing to the top associated with the list when a consumer searches for keywords related in order to your business. With no Local SEO strategy in place your company may not be able to get advantage of the local on-line demand for your products or even services. 
If a person want higher rankings, you require to read his stuff : he's the Unicorn among the sea of donkey SEOs. Every business is limited in order to its own budget for customization organic marketing results and the particular SEO professionals' aim is in order to make the most of can be allotted. Consequently, Link Building is a single the most important in the particular list of ‘SEO trends 2018'. That's why a good consumer experience from an SEO viewpoint is more than your web site's speed. For instance, several businesses miss the mark along with SEO and images, and nevertheless rank well. If you want increased rankings, you need to examine his stuff - he's the particular Unicorn among a sea associated with donkey SEOs. Consumer Experience as such has a good enormous influence on SEO. Off-page SEO very efficiently within promoting your company where interpersonal media, bookmarking sites, forums, weblog directory, Q&A, articles, videos, picture and infographic sharing, and record sharing play well. So just how do the SEO masters cope-up with the ever-changing context through the search engines? For instance, parenthetically we want to create a blog post about SEARCH ENGINE OPTIMIZATION because we are trying in order to build a inbound link in order to an SEO website. That will number will probably tick upward as SEOs become more advanced in their strategies and Search engines is constantly on the location so much emphasis on hyperlinks. Many business individuals find checking up on the particular "moving target" of SEO distracts them from daily priorities a lot more than they ever imagined, consequently it is good to seem closely at why is feeling for each business. Your readers are always the almost all important part in regards in order to a blog post, so in no way compromise writing quality for SEARCH ENGINE OPTIMIZATION benefit. 
But mobile SEO is various in terms of search habits, quality signals, levels of consumer engagement, and above all, rank algorithms. For this particular to happen a variety associated with local SEO strategies, need in order to be implemented to obtain the site positioned on search engines like Search engines, business directories such as Yelp, Superpages, Google My Business list etc. This will be a development opportunity for content marketing-specific firms and a necessary and validated budget line item for in-house SEO teams. According to experts, Content is usually the most critical element associated with SEO trends in 2018. Along with an increasingly crowded online room, staying on top of SEARCH ENGINE OPTIMIZATION trends and adapting to all of them can help you beat away the competition. Here are usually the most important SEO modifications you need to integrate directly into your 2018 content strategy. Just remember in order to pay attention to solid content material creation and copywriting fundamentals, participate your viewers deeply, and remain abreast of technical trends such as backlinks, SEO health, site rate, and schema. Lookup Engine Optimization (SEO) is the particular scientific art of optimizing your own website around specific keywords within order to rank higher looking results, such as Google. Gianluca is usually a Moz Associate, blogger, originator of the Web Marketing meeting The Inbounder and an SEARCH ENGINE OPTIMIZATION and Digital Strategist independent expert together with his company, 1 of the Agencies recommended simply by MozYou can find Gianluca upon Twitter sharing the best regarding web marketing and blabbing furthermore about space, politics and football. Posts in this blog are readable plus informative, sharing news about Search engines, SEO trends and more. 
With brand new technologies and techniques, algorithm improvements and trends, SEO is permanently evolving and offering new possibilities for marketers to rank highly looking results. I feel that specialized SEO mistakes that affect examine budget - and also dirty Google with non-SEO-friendly content like as social landing pages, Wp media archives, offer pages plus cloned e-commerce product pages : will have a more harmful effect on sites moving ahead. Having the effort to comprehend actually the basics of SEO may help your site gain increased click-through rates, engagement, and associated with course, rankings. Getting experience that spans multiple stations with an integrated mindset : especially on SEO and PAY PER CLICK synergy, combined with a efficiency mentality—sets up people who are usually new to the marketplace with regard to success. Voice lookup is also an emerging craze in SEO strategies. Regardless of the fact that SEO offers the highest ROI of any kind of ecommerce marketing campaign, most on the internet shops are put together along with little to no consideration associated with search engines like google. Consider these points while producing SEO strategy to get higher ranking in search results. The particular term SEO in relation in order to seo is also used from times to make reference in order to search engine optimizers, who are usually consultants that mange and help the development and completion associated with search engine optimization projects for clients. The faster you realize why Google will be sending you less traffic compared to it did last year, the particular sooner you can clean upward and focus on proactive SEO that will begins to impact your ranks in a positive way. Obviously, keyword ranks is another very important metric to track if you are usually analyzing your SEO efforts. Tone of voice search is one of the particular latest SEO trends in 2018. With most that said…SEO is still regarding content and links. 
Use long-tail key phrases, full sentences and questions -- although your present keywords are usually still important and play the big part in SEO, even more people are choosing to research in full questions now, since they're speaking, instead of within short term. This is the component of an SEO strategy that will focuses on building quality exterior referral links that trust the given site as an power or source in the focused categories. Long gone are the days where very much of what one did intended for SEO was purely for SEARCH ENGINE OPTIMIZATION (link building, highly optimized content material, content only for the various search engines, etc). Well, backlinks are the particular evergreen ranking factor of SEARCH ENGINE OPTIMIZATION, there's no doubt about this. But times are changing plus so does the things. I've been ranking video clips last few weeks for key phrase like SEO Outsourcing” and Wp training London” and some SEARCH ENGINE OPTIMIZATION agency terms here. So, to make it easier for you personally, we compiled a list of the 25 best SEOย tools you need to know and grouped them into five neat categories: keyword research tools, technical SEO, backlink analysis, link building and rank tracking. We looked at hundreds associated with SEO companies to decide which usually ones would make our ranks list. Thus, really time for mobile optimization regarding better SEO result. SEO means helping businesses perform everything they can to create more money from those who else are visiting their website through organic search. Google provides been actively encouraging businesses in order to convert their websites to HTTPS, and those websites are getting rewarded with a minor SEARCH ENGINE OPTIMIZATION boost. 
Now please bear in mind most of this is my viewpoint on SEO but remember that will is from someone who is definitely getting results with search motor optimization and to prove this particular you can run a couple of Google searches to see that will yes Infobunny gets results towards some very, very strong respected competition. All of these types of strategies wrapped up are identified as SEO, or seo. At the most basic degree, SEOs need to think regarding the fact Google can significantly closely imitate what a individual user can do. Search engine optimization has in order to evolve with the times, yet not in the way almost all people describe it. Every period I see someone write SEARCH ENGINE OPTIMIZATION has changed a lot within 20 years” I laugh out there loud. Keyword research is one associated with the most important aspects associated with SEO. It's that will period of year again, whenever we see an influx associated with SEO articles forecasting trends in order to look for in the brand-new year. I move into much more detail within SEO Page Titles: 15-Point Directory for B2B and B2C Brand names, which explains the best methods to work in relevant key phrases that accurately reflect the web page content. The main thing, however, will be that Local SEO is almost always cheaper and more efficient than traditional marketing. Lookup Engine Optimization; short for SEARCH ENGINE OPTIMIZATION, is a process coping along with the optimization of websites within a way that they can get high ranking within the particular search engines. Associated with all the SEO tools out there there, thousands of online entrepreneurs, small businesses proprietors, and SEO's prefer to use Ahrefs in order to help them improve their research rankings. Key phrases are like a compass regarding the SEO campaigns: they inform you the best and regardless of whether or not you're making improvement. 
Knoxweb understands search motor technology, and can set a person up with an SEO strategy which will see results. This particular technique involves collecting a listing of brand mentions of your own website using any SEO device and reaching out to internet sites to convert your brand describes into backlinks. As search engines always develop more advanced technologies plus responsiveness to online searches, fresh SEO techniques will emerge in order to keep companies ranking high upon search engines. If you increase your own site loading speed, develop the beautiful mobile-friendly design, and framework your website for users, a person will notice a positive effect on your SEO performance.
0 notes
Text
The best ways to develop an SEO Avengers group to record your market Searchmetrics SEO Blog
Who is your favorite Avenger?
Recently I worked with a client to put together their SEO team. The question they posed was: “if we have unlimited resources and want to put together the best team for SEO, who and what do we need?”
Structuring (old and new) internal resources is a common challenge that’s hardly written about. I want to share my experience and knowledge about this topic with you, because it addresses different needs for different positions out there.
With their second movie expected to be one of the best in 2015, let’s use Marvel’s Avengers as an example to determine the type of specialists you need and why they need to work together.
Why is this article for you?
If you have to put together a team yourself, this article will help you bring structure to your approach.
If you are already leading a team of experts, it will provide insights on how you can further improve your team.
If you are working within an SEO team, this post can help you to better understand your role within the team. More importantly, it can provide you with an understanding of the direction you may want to develop your future career into.
If you don’t have the budget for a big SEO team or you now have the opportunity to have even more staff, you can learn how to either consolidate responsibilities or split them up more granularly.
Nick Fury – Head of SEO
Every team needs a Nick Fury, someone who can manage and lead the heroes. It’s the person everyone in the team is reporting to and that typically reports into upper management (head of online marketing, CMO or a VP).
Being the Nick Fury of your team means you need to be the General versus being the foot soldier. What I mean is that you don’t have to be as deep in the weeds as your analyst, content marketing manager or on-page optimization manager. It’s good to have solid knowledge of these topics and if you have developed yourself out of one of these positions – even better! But the core task is to create a high-level strategy that has a positive impact on the business.
The two main tasks of a team leader are:
Coordination
Decision-making
Coordination is a very broad term. Let me list out the subtasks that might be covered:
Prioritizing projects and tasks
Allocating resources to projects
Bringing different parties together, including internal departments and external clients/vendors
Empowering your team to improve itself and develop
Aligning tactics to fit the business's overall strategy and goals
Decision-making on the other hand is very clear and can be defined as: “deciding which way to go after evaluating all options”.
Ultimately, it’s about leading and empowering your SEO team, ensuring they perform at their best and can make the right decisions.
Iron Man – Analyst
Every team needs their very own Iron Man (or at least someone who partially does his job)! Without an analyst, it’s like driving a car with your eyes closed – it’s only a matter of time until you crash.
As an analyst, it’s your responsibility to draw conclusions from the data and provide the members of your team with answers. Decisions have to be data-driven, so if you don’t provide the data, good decisions cannot be made.
The two main responsibilities are:
Reports
Analysis
Reports must contain actionable data with crystal clear answers. The data from various sources needs to be connected in one dashboard. However, it's important to note that depending on the business's needs, one dashboard cannot fit all parties and therefore it must be targeted. At Searchmetrics we are able to provide dashboards for various people, including C-Level, the SEO team, the dev-team, editors and more. All of these reports are targeted and have different meanings.
We typically distinguish between ad-hoc and ongoing reports. An example of the former would be a question such as “how well is our new landing page performing?”, while an example of the latter is “how has our traffic developed over the last three years?” Ad-hoc reports answer an initial one-time question, while ongoing reports serve the purpose of measuring success over time.
The analysis needs to be done first in order to create the reports. It’s defined by the collection, segmentation and procession of data. Here’s a taste of the type of metrics that are analyzed:
User signals (bounce rate, time on site, pages / visit, etc.)
Leads / revenue per keyword / topic / URL
Low hanging fruit rankings (#11-20)
Brand vs. non-brand rankings
Traffic per social network
Other assets that are analyzed could be:
Competitors
Own website(s)
Social networks
Search engines
One analysis a lot of people tend to forget about is the server log file analysis. This analysis is tremendously helpful as it provides insights on how search engines crawl your website, i.e. where they get stuck, how long it takes them to crawl, etc. By matching this data with your URL rankings, you can create a very powerful report. (But this topic demands a blog article on its own.)
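A server log file analysis like the one mentioned above usually starts by filtering crawler requests out of the raw access log. Here's a minimal sketch assuming the common Apache/Nginx "combined" log format; the sample log lines are made up for illustration:

```python
import re
from collections import Counter

# Rough pattern for the Apache/Nginx "combined" log format.
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(log_lines):
    """Count Googlebot requests per (URL, status code) pair."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits[(m.group("url"), m.group("status"))] += 1
    return hits

sample = [
    '66.249.66.1 - - [04/Feb/2015:10:00:00 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [04/Feb/2015:10:00:02 +0000] "GET /old-page HTTP/1.1" 404 320 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [04/Feb/2015:10:00:03 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
print(googlebot_hits(sample))
```

Joining the resulting crawl counts with ranking data per URL is then a simple key match; note that a serious setup would also verify Googlebot's IP via reverse DNS, since the user-agent string can be faked.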
Every company is different; therefore analysts have slightly different tasks. The main purpose of an Iron Man is the same though: provide insights to make the best decisions for the business.
Hulk – Content Manager
Content is one of the most important assets for a website, therefore you need a Hulk in your SEO Avengers team to optimize, maintain and drive it. Content is not only text, it can be videos, pictures, graphics, whitepapers, PDFs, etc.
Each of these assets have to be created, managed and optimized over time. Nowadays it’s not enough to simply create a piece of content, park it on your website and forget it there. It has to constantly be optimized and you need to understand how users like and interact with it in order to drive traffic and revenue.
A content manager needs to have different touch points with the:
Social Media manager
Analysts
On-Page Manager
Media manager
Editorial team
Of course someone has to produce the content and in most cases it's too much for the content manager to do everything by him or herself. It's the content manager's task to coordinate the editorial team, guest bloggers and other content producers. The produced content needs to also be reviewed (to a manageable degree) and analyzed for search engine relevance (not to say that content is for search engines only, but the best content has to be found).
It makes sense to work closely together with the responsible person for social media, Black Widow maybe ;-), to promote new and existing content, find out how and if content is shared on social networks and how to streamline it by analyzing how people talk about it and what they think. It’s not an easy task, but if you’re able to work closely together and set up processes, you can make your content better than anyone else’s.
Working together with the On-Page manager (Captain America), the Hulk of your team has to figure out which topic / keywords are supposed to rank for which URL. This is a crucial process that never ends, but can make a site really successful when done right. A part of that is also identifying new topics in order to grow, of course after you’ve covered all obvious topics for your business.
An important tool of a content manager is a content calendar. On it, all recurring events that either impact the general population (holidays like Christmas, worldwide events like the FIFA World Cup) or the industry (like the WWDC) should be listed. This allows you to create content in advance and rank for related keywords/topics.
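The content calendar idea above can be reduced to a small data structure plus a lead-time calculation. A minimal sketch (the events and lead times are invented examples):

```python
from datetime import date, timedelta

# Hypothetical recurring events; lead_weeks is how far ahead content work should start.
CALENDAR = [
    {"event": "Christmas", "date": date(2015, 12, 25), "lead_weeks": 8},
    {"event": "WWDC", "date": date(2015, 6, 8), "lead_weeks": 4},
]

def production_schedule(calendar):
    """For each event, compute the date content production should kick off."""
    return [
        {"event": e["event"],
         "start_producing": e["date"] - timedelta(weeks=e["lead_weeks"])}
        for e in calendar
    ]

for item in production_schedule(CALENDAR):
    print(item["event"], "->", item["start_producing"].isoformat())
```

The lead time matters because search engines need time to crawl, index and rank new pages before the seasonal demand peaks.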
I don’t want to talk about content marketing too much as striving to provide the best content overlaps with content marketing. However, there is still a part of initiating campaigns that have to be targeted, seeded and monitored. This is also part of the content manager’s job. He needs to leverage touch points with the creative department for the implementation and the analyst (Iron Man) for input and inspiration.
Tip: Don’t forget internationalization: managing content in different languages and for different countries can become a big part of a content manager’s work. If you want to be successful, you can’t just Google Translate.
Captain America – On-Page Manager
The part of Captain America is also the most technical part of SEO. Only with good teamwork between Captain America, the Hulk and Iron Man can the team perform at its best.
Tasks of the On-Page Manager are to coordinate:
Regular site audits
Crawls of all kinds (site-wide, folders, test environments)
Optimization of meta-data
Implementation and review of markup
Management and optimization of site architecture:
Internal linking
Status codes
Hierarchy of pages
URL-Structure
Backlink / Link juice management
Site speed optimization
Mobile version of the site
One major responsibility is the coordination of the dev-team. It's often a challenge, but successful SEO demands figuring this out. A certain share of developer resources has to be attributed to SEO, processes have to be managed well and an emphasis has to be placed on agility and flexibility. In the best case, the On-Page manager has access to the CMS and can influence / change certain parameters by themselves (meta-data, content, media).
Another responsibility is conducting regular site audits and crawls, which can overlap, i.e. figuring out weaknesses within the site that decrease user experience and search engine accessibility. I recommend auditing the entire website at least once every few months and crawling every big release on a test environment. Hunt for problematic status codes, especially 4xx, 5xx, 302s and redirect chains.
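The status-code hunt described above amounts to bucketing crawl results into problem classes. A minimal offline sketch — the crawl data here is invented, and a real audit would come from a crawler like Screaming Frog:

```python
def classify_crawl(results):
    """Bucket crawl results into problem classes: 4xx, 5xx, 302s, redirect chains.

    `results` maps URL -> (status_code, redirect_target or None).
    """
    problems = {"client_error": [], "server_error": [],
                "temporary_redirect": [], "redirect_chain": []}
    for url, (status, target) in results.items():
        if 400 <= status < 500:
            problems["client_error"].append(url)
        elif status >= 500:
            problems["server_error"].append(url)
        elif status == 302:
            problems["temporary_redirect"].append(url)
        # A redirect pointing at another redirect is a chain.
        if target is not None and results.get(target, (200, None))[0] in (301, 302):
            problems["redirect_chain"].append(url)
    return problems

results = {
    "/a": (301, "/b"),       # redirects to another redirect -> chain
    "/b": (302, "/c"),       # temporary redirect
    "/c": (200, None),
    "/missing": (404, None),
    "/broken": (500, None),
}
print(classify_crawl(results))
```

Each bucket then maps to an action: fix or remove links to 4xx pages, debug 5xx pages, turn 302s into 301s where the move is permanent, and collapse chains so each redirect points directly at the final URL.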
Backlink management is another big chunk of responsibility the On-Page manager has to carry. It’s not focused on getting new backlinks for the site, but to:
Determine which pages need more backlinks and provide this information to the content manager to plan accordingly
Analyze how the current incoming link juice is spread across the site and either adjust incoming links (e.g. change the target URL) or the site structure
Analyze the backlink profile on a regular basis to ensure you are not in danger of penalties or algorithm updates
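Analyzing how link juice spreads across a site can be approximated with a PageRank-style iteration over the internal link graph. This toy sketch is only an illustration of the concept, not the algorithm any particular tool uses, and the link graph is made up:

```python
def internal_link_juice(links, iterations=50, damping=0.85):
    """Toy PageRank over an internal link graph (page -> list of pages it links to)."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share  # each target gets an equal share
        rank = new_rank
    return rank

links = {
    "/": ["/products", "/blog"],
    "/products": ["/"],
    "/blog": ["/", "/products"],
}
rank = internal_link_juice(links)
print(max(rank, key=rank.get))  # → / (the homepage collects the most juice here)
```

Run over a full crawl, this kind of model shows which pages hoard authority and which important pages are starved, which is exactly the input needed to adjust internal linking or the site structure.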
Captain America also has to maintain and optimize the mobile version of the site based on information from the Google Webmaster Tools, Google Analytics and server log file analysis. It’s a great touch point to exchange data with Iron Man in order to find the right spots to fix. User signals lead the way here.
Final advice
If the Captain America of your team does not get information from Iron Man (the Analyst), like analytics on traffic, he cannot optimize the site's architecture. If he doesn't get input from the content manager, he cannot fully optimize URLs. See where I'm going with this? Teamwork! You can only dominate a market when all parts of the SEO team function together. Regular meetings and project management tools like Atlassian JIRA support this.
Also, do not forget one of the most critical parts of SEO: staying up to date and developing yourself. Nick Fury has to ensure that staff are constantly challenged, but also get the resources to develop themselves and stay up to date by reading reports, like the Ranking Factors study, and attending conferences.
Kevin Indig has been an SEO Consultant for the Searchmetrics Pro Services team. He helps enterprise companies implement critical SEO strategies.
Source
https://blog.searchmetrics.com/us/2015/02/04/how-to-build-an-seo-team-to-capture-your-market/
How Much Data Is Missing from Analytics? And Other Analytics Black Holes
Posted by Tom.Capper
If you’ve ever compared two analytics implementations on the same site, or compared your analytics with what your business is reporting in sales, you’ve probably noticed that things don’t always match up. In this post, I’ll explain why data is missing from your web analytics platforms and how large the impact could be. Some of the issues I cover are actually quite easily addressed, and have a decent impact on traffic — there’s never been an easier way to hit your quarterly targets. ;)
I’m going to focus on GA (Google Analytics), as it's the most commonly used provider, but most on-page analytics platforms have the same issues. Platforms that rely on server logs do avoid some issues but are fairly rare, so I won’t cover them in any depth.
Side note: Our test setup (multiple trackers & customized GA)
On Distilled.net, we have a standard Google Analytics property running from an HTML tag in GTM (Google Tag Manager). In addition, for the last two years, I’ve been running three extra concurrent Google Analytics implementations, designed to measure discrepancies between different configurations.
(If you’re just interested in my findings, you can skip this section, but if you want to hear more about the methodology, continue reading. Similarly, don’t worry if you don’t understand some of the detail here — the results are easier to follow.)
Two of these extra implementations — one in Google Tag Manager and one on page — run locally hosted, renamed copies of the Google Analytics JavaScript file (e.g. www.distilled.net/static/js/au3.js, instead of www.google-analytics.com/analytics.js) to make them harder to spot for ad blockers. I also used renamed JavaScript functions (“tcap” and “Buffoon,” rather than the standard “ga”) and renamed trackers (“FredTheUnblockable” and “AlbertTheImmutable”) to avoid having duplicate trackers (which can often cause issues).
This was originally inspired by 2016-era best practice on how to get your Google Analytics setup past ad blockers. I can’t find the original article now, but you can see a very similar one from 2017 here.
Lastly, we have (“DianaTheIndefatigable”), which just has a renamed tracker, but uses the standard code otherwise and is implemented on-page. This is to complete the set of all combinations of modified and unmodified GTM and on-page trackers.
Two of Distilled’s modified on-page trackers, as seen on https://www.distilled.net/
Overall, this table summarizes our setups:
| Tracker | Renamed function? | GTM or on-page? | Locally hosted JavaScript file? |
|---|---|---|---|
| Default | No | GTM HTML tag | No |
| FredTheUnblockable | Yes - "tcap" | GTM HTML tag | Yes |
| AlbertTheImmutable | Yes - "buffoon" | On page | Yes |
| DianaTheIndefatigable | No | On page | No |
I tested their functionality in various browser/ad-block environments by watching for the pageviews appearing in browser developer tools:
Reason 1: Ad Blockers
Ad blockers, primarily as browser extensions, have been growing in popularity for some time now. Primarily this has been to do with users looking for better performance and UX on ad-laden sites, but in recent years an increased emphasis on privacy has also crept in, hence the possibility of analytics blocking.
Effect of ad blockers
Some ad blockers block web analytics platforms by default, others can be configured to do so. I tested Distilled’s site with Adblock Plus and uBlock Origin, two of the most popular ad-blocking desktop browser addons, but it’s worth noting that ad blockers are increasingly prevalent on smartphones, too.
Here’s how Distilled’s setups fared:
(All numbers shown are from April 2018)
| Setup | Vs. Adblock | Vs. Adblock with "EasyPrivacy" enabled | Vs. uBlock Origin |
|---|---|---|---|
| GTM | Pass | Fail | Fail |
| On page | Pass | Fail | Fail |
| GTM + renamed script & function | Pass | Fail | Fail |
| On page + renamed script & function | Pass | Fail | Fail |
Seems like those tweaked setups didn’t do much!
Lost data due to ad blockers: ~10%
Ad blocker usage can be in the 15–25% range depending on region, but many of these installs will be default setups of AdBlock Plus, which as we’ve seen above, does not block tracking. Estimates of AdBlock Plus’s market share among ad blockers vary from 50–70%, with more recent reports tending more towards the former. So, if we assume that at most 50% of installed ad blockers block analytics, that leaves your exposure at around 10%.
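The arithmetic behind that ~10% figure can be made explicit. A quick sketch using the article's own estimates (the penetration and blocking-share figures are rough ranges, not measurements):

```python
# Rough exposure estimate: ad-blocker penetration of 15-25%, of which at most
# ~50% of installs actually block analytics (per the estimates above).

def analytics_blocking_exposure(adblock_rate, blocking_share):
    """Fraction of visitors whose pageviews never reach analytics."""
    return adblock_rate * blocking_share

low = analytics_blocking_exposure(0.15, 0.5)   # 7.5%
high = analytics_blocking_exposure(0.25, 0.5)  # 12.5%
print(f"Exposure: {low:.1%} - {high:.1%}")     # midpoint is roughly 10%
```

Plugging in your own region's penetration figure narrows the range considerably.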
Reason 2: Browser “do not track”
This is another privacy-motivated feature, this time of browsers themselves. You can enable it in the settings of most current browsers. It's not compulsory for sites or platforms to obey the "do not track" request, but Firefox offers a stronger feature under the same set of options, which I decided to test as well.
Effect of “do not track”
Most browsers now offer the option to send a “Do not track” message. I tested the latest releases of Firefox & Chrome for Windows 10.
| Setup | Chrome "do not track" | Firefox "do not track" | Firefox "tracking protection" |
|---|---|---|---|
| GTM | Pass | Pass | Fail |
| On page | Pass | Pass | Fail |
| GTM + renamed script & function | Pass | Pass | Fail |
| On page + renamed script & function | Pass | Pass | Fail |
Again, it doesn’t seem that the tweaked setups are doing much work for us here.
Lost data due to “do not track”: <1%
Only Firefox Quantum’s “Tracking Protection,” introduced in February, had any effect on our trackers. Firefox has a 5% market share, but Tracking Protection is not enabled by default. The launch of this feature had no effect on the trend for Firefox traffic on Distilled.net.
Reason 3: Filters
It’s a bit of an obvious one, but filters you’ve set up in your analytics might intentionally or unintentionally reduce your reported traffic levels.
For example, a filter excluding certain niche screen resolutions that you believe to be mostly bots, or internal traffic, will obviously cause your setup to underreport slightly.
Lost data due to filters: ???
Impact is hard to estimate, as setup will obviously vary on a site-by-site basis. I do recommend having a duplicate, unfiltered "master" view in case you realize too late you've lost something you didn't intend to.
Reason 4: GTM vs. on-page vs. misplaced on-page
Google Tag Manager has become an increasingly popular way of implementing analytics in recent years, due to its increased flexibility and the ease of making changes. However, I’ve long noticed that it can tend to underreport vs. on-page setups.
I was also curious about what would happen if you didn’t follow Google’s guidelines in setting up on-page code.
By combining my numbers with numbers from my colleague Dom Woodman's site (you're welcome for the link, Dom), which happens to use a Drupal analytics add-on as well as GTM, I was able to see the difference between Google Tag Manager and misplaced on-page code (right at the bottom of the <body> tag). I then weighted this against my own Google Tag Manager data to get an overall picture of all 5 setups.
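Mechanically, the "percentage of baseline" numbers that follow are just each setup's pageview count divided by the standard GTM count. A minimal sketch with hypothetical counts (the real comparison used full profile data):

```python
# Normalize raw pageview counts against the standard GTM implementation,
# which serves as the 100% baseline. Counts below are made up for illustration.

def percent_of_baseline(counts, baseline_key="GTM"):
    base = counts[baseline_key]
    return {setup: round(100 * n / base, 2) for setup, n in counts.items()}

month = {"GTM": 12000, "On-page in <head>": 12092, "Misplaced in <body>": 11370}
print(percent_of_baseline(month))
```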
Effect of GTM and misplaced on-page code
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
| Browser | Google Tag Manager | Modified & Google Tag Manager | On-Page Code In <head> | Modified & On-Page Code In <head> | On-Page Code Misplaced In <body> |
|---|---|---|---|---|---|
| Chrome | 100.00% | 98.75% | 100.77% | 99.80% | 94.75% |
| Safari | 100.00% | 99.42% | 100.55% | 102.08% | 82.69% |
| Firefox | 100.00% | 99.71% | 101.16% | 101.45% | 90.68% |
| Internet Explorer | 100.00% | 80.06% | 112.31% | 113.37% | 77.18% |
There are a few main takeaways here:
On-page code generally reports more traffic than GTM
Modified code is generally within a margin of error, apart from modified GTM code on Internet Explorer (see note below)
Misplaced analytics code will cost you up to a third of your traffic vs. properly implemented on-page code, depending on browser (!)
The customized setups, which are designed to get more traffic by evading ad blockers, are doing nothing of the sort.
It’s worth noting also that the customized implementations actually got less traffic than the standard ones. For the on-page code, this is within the margin of error, but for Google Tag Manager, there’s another reason — because I used unfiltered profiles for the comparison, there’s a lot of bot spam in the main profile, which primarily masquerades as Internet Explorer. Our main profile is by far the most spammed, and also acting as the baseline here, so the difference between on-page code and Google Tag Manager is probably somewhat larger than what I’m reporting.
I also split the data by mobile, out of curiosity:
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
| Device | Google Tag Manager | Modified & Google Tag Manager | On-Page Code In <head> | Modified & On-Page Code In <head> | On-Page Code Misplaced In <body> |
|---|---|---|---|---|---|
| Desktop | 100.00% | 98.31% | 100.97% | 100.89% | 93.47% |
| Mobile | 100.00% | 97.00% | 103.78% | 100.42% | 89.87% |
| Tablet | 100.00% | 97.68% | 104.20% | 102.43% | 88.13% |
The further takeaway here seems to be that mobile browsers, like Internet Explorer, can struggle with Google Tag Manager.
Lost data due to GTM: 1–5%
Google Tag Manager seems to cost you a varying amount depending on what make-up of browsers and devices use your site. On Distilled.net, the difference is around 1.7%; however, we have an unusually desktop-heavy and tech-savvy audience (not much Internet Explorer!). Depending on vertical, this could easily swell to the 5% range.
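One way to translate the per-browser figures into a site-level estimate is to weight each browser's GTM shortfall by your own browser mix. A sketch; the mix below is hypothetical, and the shortfall figures are the gaps between the GTM and on-page-in-head columns of the desktop table above:

```python
# Weight per-browser GTM undercount by a (hypothetical) browser mix.
# Shortfalls: on-page <head> percentage minus 100% GTM baseline, as fractions.
browser_mix = {"Chrome": 0.70, "Safari": 0.15, "Firefox": 0.10, "IE": 0.05}
gtm_shortfall = {"Chrome": 0.0077, "Safari": 0.0055, "Firefox": 0.0116, "IE": 0.1231}

site_loss = sum(browser_mix[b] * gtm_shortfall[b] for b in browser_mix)
print(f"Estimated GTM undercount for this mix: {site_loss:.2%}")
```

With a more IE-heavy audience, the same calculation quickly climbs toward the top of the 1-5% range.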
Lost data due to misplaced on-page code: ~10%
On Teflsearch.com, the impact of misplaced on-page code was around 7.5%, vs Google Tag Manager. Keeping in mind that Google Tag Manager itself underreports, the total loss could easily be in the 10% range.
Bonus round: Missing data from channels
I’ve focused above on areas where you might be missing data altogether. However, there are also lots of ways in which data can be misrepresented, or detail can be missing. I’ll cover these more briefly, but the main issues are dark traffic and attribution.
Dark traffic
Dark traffic is direct traffic that didn’t really come via direct — which is generally becoming more and more common. Typical causes are:
Untagged campaigns in email
Untagged campaigns in apps (especially Facebook, Twitter, etc.)
Misrepresented organic
Data sent from botched tracking implementations (which can also appear as self-referrals)
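A first pass at the "untagged campaigns" problem is simply checking the links you control for UTM parameters. A minimal sketch; the two parameters checked here are the usual minimum for campaign tracking, and the URLs are made up:

```python
# Flag outbound campaign links that will show up in GA as direct traffic
# because they carry no campaign tags.
from urllib.parse import urlparse, parse_qs

REQUIRED = {"utm_source", "utm_medium"}

def is_tagged(url):
    params = parse_qs(urlparse(url).query)
    return REQUIRED.issubset(params)

links = [
    "https://example.com/offer?utm_source=newsletter&utm_medium=email",
    "https://example.com/offer",  # untagged: will report as direct
]
for link in links:
    print(link, "OK" if is_tagged(link) else "UNTAGGED")
```

Running a check like this over email templates and app share links before they ship catches most self-inflicted dark traffic.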
It’s also worth noting the trend towards genuinely direct traffic that would historically have been organic. For example, due to increasingly sophisticated browser autocompletes, cross-device history, and so on, people end up “typing” a URL that they’d have searched for historically.
Attribution
I’ve written about this in more detail here, but in general, a session in Google Analytics (and any other platform) is a fairly arbitrary construct — you might think it’s obvious how a group of hits should be grouped into one or more sessions, but in fact, the process relies on a number of fairly questionable assumptions. In particular, it’s worth noting that Google Analytics generally attributes direct traffic (including dark traffic) to the previous non-direct source, if one exists.
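The last-non-direct rule is easy to see in a toy model. The sketch below captures only the attribution rule described above; real GA sessionization also involves session timeouts and campaign expiry, which this ignores:

```python
# "Last non-direct" attribution: a direct session inherits the most recent
# non-direct source for that user, if one exists.

def attribute(sessions):
    """sessions: chronological list of traffic sources for one user."""
    attributed, last_non_direct = [], None
    for source in sessions:
        if source != "direct":
            last_non_direct = source
        attributed.append(last_non_direct or "direct")
    return attributed

print(attribute(["organic", "direct", "email", "direct", "direct"]))
# → ['organic', 'organic', 'email', 'email', 'email']
```

Note that in this model "direct" only survives as a reported channel when no earlier non-direct source exists at all.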
Discussion
I was quite surprised by some of my own findings when researching this post, but I’m sure I didn’t get everything. Can you think of any other ways in which data can end up missing from analytics?
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
https://ift.tt/2q13Myy xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? 
And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B xem thêm tại: https://ift.tt/2mb4VST để biết thêm về địa chỉ bán tai nghe không dây giá rẻ How Much Data Is Missing from Analytics? And Other Analytics Black Holes https://ift.tt/2GWKq1B Bạn có thể xem thêm địa chỉ mua tai nghe không dây tại đây https://ift.tt/2mb4VST
0 notes
isearchgoood · 6 years
Text
How Much Data Is Missing from Analytics? And Other Analytics Black Holes
Posted by Tom.Capper
If you’ve ever compared two analytics implementations on the same site, or compared your analytics with what your business is reporting in sales, you’ve probably noticed that things don’t always match up. In this post, I’ll explain why data is missing from your web analytics platforms and how large the impact could be. Some of the issues I cover are actually quite easily addressed, and have a decent impact on traffic — there’s never been an easier way to hit your quarterly targets. ;)
I’m going to focus on GA (Google Analytics), as it's the most commonly used provider, but most on-page analytics platforms have the same issues. Platforms that rely on server logs do avoid some issues but are fairly rare, so I won’t cover them in any depth.
Side note: Our test setup (multiple trackers & customized GA)
On Distilled.net, we have a standard Google Analytics property running from an HTML tag in GTM (Google Tag Manager). In addition, for the last two years, I’ve been running three extra concurrent Google Analytics implementations, designed to measure discrepancies between different configurations.
(If you’re just interested in my findings, you can skip this section, but if you want to hear more about the methodology, continue reading. Similarly, don’t worry if you don’t understand some of the detail here — the results are easier to follow.)
Two of these extra implementations — one in Google Tag Manager and one on page — run locally hosted, renamed copies of the Google Analytics JavaScript file (e.g. www.distilled.net/static/js/au3.js, instead of www.google-analytics.com/analytics.js) to make them harder to spot for ad blockers. I also used renamed JavaScript functions (“tcap” and “Buffoon,” rather than the standard “ga”) and renamed trackers (“FredTheUnblockable” and “AlbertTheImmutable”) to avoid having duplicate trackers (which can often cause issues).
This was originally inspired by 2016-era best practice on how to get your Google Analytics setup past ad blockers. I can’t find the original article now, but you can see a very similar one from 2017 here.
Lastly, we have (“DianaTheIndefatigable”), which just has a renamed tracker, but uses the standard code otherwise and is implemented on-page. This is to complete the set of all combinations of modified and unmodified GTM and on-page trackers.
Two of Distilled’s modified on-page trackers, as seen on https://www.distilled.net/
Overall, this table summarizes our setups:
Tracker
Renamed function?
GTM or on-page?
Locally hosted JavaScript file?
Default
No
GTM HTML tag
No
FredTheUnblockable
Yes - “tcap”
GTM HTML tag
Yes
AlbertTheImmutable
Yes - “buffoon”
On page
Yes
DianaTheIndefatigable
No
On page
No
I tested their functionality in various browser/ad-block environments by watching for the pageviews appearing in browser developer tools:
Reason 1: Ad Blockers
Ad blockers, primarily as browser extensions, have been growing in popularity for some time now. Primarily this has been to do with users looking for better performance and UX on ad-laden sites, but in recent years an increased emphasis on privacy has also crept in, hence the possibility of analytics blocking.
Effect of ad blockers
Some ad blockers block web analytics platforms by default, others can be configured to do so. I tested Distilled’s site with Adblock Plus and uBlock Origin, two of the most popular ad-blocking desktop browser addons, but it’s worth noting that ad blockers are increasingly prevalent on smartphones, too.
Here’s how Distilled’s setups fared:
(All numbers shown are from April 2018)
Setup
Vs. Adblock
Vs. Adblock with “EasyPrivacy” enabled
Vs. uBlock Origin
GTM
Pass
Fail
Fail
On page
Pass
Fail
Fail
GTM + renamed script & function
Pass
Fail
Fail
On page + renamed script & function
Pass
Fail
Fail
Seems like those tweaked setups didn’t do much!
Lost data due to ad blockers: ~10%
Ad blocker usage can be in the 15–25% range depending on region, but many of these installs will be default setups of AdBlock Plus, which as we’ve seen above, does not block tracking. Estimates of AdBlock Plus’s market share among ad blockers vary from 50–70%, with more recent reports tending more towards the former. So, if we assume that at most 50% of installed ad blockers block analytics, that leaves your exposure at around 10%.
Reason 2: Browser “do not track”
This is another privacy motivated feature, this time of browsers themselves. You can enable it in the settings of most current browsers. It’s not compulsory for sites or platforms to obey the “do not track” request, but Firefox offers a stronger feature under the same set of options, which I decided to test as well.
Effect of “do not track”
Most browsers now offer the option to send a “Do not track” message. I tested the latest releases of Firefox & Chrome for Windows 10.
Setup
Chrome “do not track”
Firefox “do not track”
Firefox “tracking protection”
GTM
Pass
Pass
Fail
On page
Pass
Pass
Fail
GTM + renamed script & function
Pass
Pass
Fail
On page + renamed script & function
Pass
Pass
Fail
Again, it doesn’t seem that the tweaked setups are doing much work for us here.
Lost data due to “do not track”: <1%
Only Firefox Quantum’s “Tracking Protection,” introduced in February, had any effect on our trackers. Firefox has a 5% market share, but Tracking Protection is not enabled by default. The launch of this feature had no effect on the trend for Firefox traffic on Distilled.net.
Reason 3: Filters
It’s a bit of an obvious one, but filters you’ve set up in your analytics might intentionally or unintentionally reduce your reported traffic levels.
For example, a filter excluding certain niche screen resolutions that you believe to be mostly bots, or internal traffic, will obviously cause your setup to underreport slightly.
Lost data due to filters: ???
Impact is hard to estimate, as setup will obviously vary on a site-by site-basis. I do recommend having a duplicate, unfiltered “master” view in case you realize too late you’ve lost something you didn’t intend to.
Reason 4: GTM vs. on-page vs. misplaced on-page
Google Tag Manager has become an increasingly popular way of implementing analytics in recent years, due to its increased flexibility and the ease of making changes. However, I’ve long noticed that it can tend to underreport vs. on-page setups.
I was also curious about what would happen if you didn’t follow Google’s guidelines in setting up on-page code.
By combining my numbers with numbers from my colleague Dom Woodman’s site (you’re welcome for the link, Dom), which happens to use a Drupal analytics add-on as well as GTM, I was able to see the difference between Google Tag Manager and misplaced on-page code (right at the bottom of the <body> tag) I then weighted this against my own Google Tag Manager data to get an overall picture of all 5 setups.
Effect of GTM and misplaced on-page code
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
Google Tag Manager
Modified & Google Tag Manager
On-Page Code In <head>
Modified & On-Page Code In <head>
On-Page Code Misplaced In <Body>
Chrome
100.00%
98.75%
100.77%
99.80%
94.75%
Safari
100.00%
99.42%
100.55%
102.08%
82.69%
Firefox
100.00%
99.71%
101.16%
101.45%
90.68%
Internet Explorer
100.00%
80.06%
112.31%
113.37%
77.18%
There are a few main takeaways here:
On-page code generally reports more traffic than GTM
Modified code is generally within a margin of error, apart from modified GTM code on Internet Explorer (see note below)
Misplaced analytics code will cost you up to a third of your traffic vs. properly implemented on-page code, depending on browser (!)
The customized setups, which are designed to get more traffic by evading ad blockers, are doing nothing of the sort.
It’s worth noting also that the customized implementations actually got less traffic than the standard ones. For the on-page code, this is within the margin of error, but for Google Tag Manager, there’s another reason — because I used unfiltered profiles for the comparison, there’s a lot of bot spam in the main profile, which primarily masquerades as Internet Explorer. Our main profile is by far the most spammed, and also acting as the baseline here, so the difference between on-page code and Google Tag Manager is probably somewhat larger than what I’m reporting.
I also split the data by mobile, out of curiosity:
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
Google Tag Manager
Modified & Google Tag Manager
On-Page Code In <head>
Modified & On-Page Code In <head>
On-Page Code Misplaced In <Body>
Desktop
100.00%
98.31%
100.97%
100.89%
93.47%
Mobile
100.00%
97.00%
103.78%
100.42%
89.87%
Tablet
100.00%
97.68%
104.20%
102.43%
88.13%
The further takeaway here seems to be that mobile browsers, like Internet Explorer, can struggle with Google Tag Manager.
Lost data due to GTM: 1–5%
Google Tag Manager seems to cost you a varying amount depending on what make-up of browsers and devices use your site. On Distilled.net, the difference is around 1.7%; however, we have an unusually desktop-heavy and tech-savvy audience (not much Internet Explorer!). Depending on vertical, this could easily swell to the 5% range.
Lost data due to misplaced on-page code: ~10%
On Teflsearch.com, the impact of misplaced on-page code was around 7.5%, vs Google Tag Manager. Keeping in mind that Google Tag Manager itself underreports, the total loss could easily be in the 10% range.
Bonus round: Missing data from channels
I’ve focused above on areas where you might be missing data altogether. However, there are also lots of ways in which data can be misrepresented, or detail can be missing. I’ll cover these more briefly, but the main issues are dark traffic and attribution.
Dark traffic
Dark traffic is direct traffic that didn’t really come via direct — which is generally becoming more and more common. Typical causes are:
Untagged campaigns in email
Untagged campaigns in apps (especially Facebook, Twitter, etc.)
Misrepresented organic
Data sent from botched tracking implementations (which can also appear as self-referrals)
It’s also worth noting the trend towards genuinely direct traffic that would historically have been organic. For example, due to increasingly sophisticated browser autocompletes, cross-device history, and so on, people end up “typing” a URL that they’d have searched for historically.
Attribution
I’ve written about this in more detail here, but in general, a session in Google Analytics (and any other platform) is a fairly arbitrary construct — you might think it’s obvious how a group of hits should be grouped into one or more sessions, but in fact, the process relies on a number of fairly questionable assumptions. In particular, it’s worth noting that Google Analytics generally attributes direct traffic (including dark traffic) to the previous non-direct source, if one exists.
Discussion
I was quite surprised by some of my own findings when researching this post, but I’m sure I didn’t get everything. Can you think of any other ways in which data can end up missing from analytics?
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
via Blogger https://ift.tt/2kwAy68 #blogger #bloggingtips #bloggerlife #bloggersgetsocial #ontheblog #writersofinstagram #writingprompt #instapoetry #writerscommunity #writersofig #writersblock #writerlife #writtenword #instawriters #spilledink #wordgasm #creativewriting #poetsofinstagram #blackoutpoetry #poetsofig
0 notes
lawrenceseitz22 · 6 years
Text
How Much Data Is Missing from Analytics? And Other Analytics Black Holes
Posted by Tom.Capper
If you’ve ever compared two analytics implementations on the same site, or compared your analytics with what your business is reporting in sales, you’ve probably noticed that things don’t always match up. In this post, I’ll explain why data is missing from your web analytics platforms and how large the impact could be. Some of the issues I cover are actually quite easily addressed, and have a decent impact on traffic — there’s never been an easier way to hit your quarterly targets. ;)
I’m going to focus on GA (Google Analytics), as it's the most commonly used provider, but most on-page analytics platforms have the same issues. Platforms that rely on server logs do avoid some issues but are fairly rare, so I won’t cover them in any depth.
Side note: Our test setup (multiple trackers & customized GA)
On Distilled.net, we have a standard Google Analytics property running from an HTML tag in GTM (Google Tag Manager). In addition, for the last two years, I’ve been running three extra concurrent Google Analytics implementations, designed to measure discrepancies between different configurations.
(If you’re just interested in my findings, you can skip this section, but if you want to hear more about the methodology, continue reading. Similarly, don’t worry if you don’t understand some of the detail here — the results are easier to follow.)
Two of these extra implementations — one in Google Tag Manager and one on page — run locally hosted, renamed copies of the Google Analytics JavaScript file (e.g. www.distilled.net/static/js/au3.js, instead of www.google-analytics.com/analytics.js) to make them harder to spot for ad blockers. I also used renamed JavaScript functions (“tcap” and “Buffoon,” rather than the standard “ga”) and renamed trackers (“FredTheUnblockable” and “AlbertTheImmutable”) to avoid having duplicate trackers (which can often cause issues).
This was originally inspired by 2016-era best practice on how to get your Google Analytics setup past ad blockers. I can’t find the original article now, but you can see a very similar one from 2017 here.
Lastly, we have (“DianaTheIndefatigable”), which just has a renamed tracker, but uses the standard code otherwise and is implemented on-page. This is to complete the set of all combinations of modified and unmodified GTM and on-page trackers.
Two of Distilled’s modified on-page trackers, as seen on https://www.distilled.net/
Overall, this table summarizes our setups:
Tracker
Renamed function?
GTM or on-page?
Locally hosted JavaScript file?
Default
No
GTM HTML tag
No
FredTheUnblockable
Yes - “tcap”
GTM HTML tag
Yes
AlbertTheImmutable
Yes - “buffoon”
On page
Yes
DianaTheIndefatigable
No
On page
No
I tested their functionality in various browser/ad-block environments by watching for the pageviews appearing in browser developer tools:
Reason 1: Ad Blockers
Ad blockers, primarily as browser extensions, have been growing in popularity for some time now. Primarily this has been to do with users looking for better performance and UX on ad-laden sites, but in recent years an increased emphasis on privacy has also crept in, hence the possibility of analytics blocking.
Effect of ad blockers
Some ad blockers block web analytics platforms by default, others can be configured to do so. I tested Distilled’s site with Adblock Plus and uBlock Origin, two of the most popular ad-blocking desktop browser addons, but it’s worth noting that ad blockers are increasingly prevalent on smartphones, too.
Here’s how Distilled’s setups fared:
(All numbers shown are from April 2018)
Setup
Vs. Adblock
Vs. Adblock with “EasyPrivacy” enabled
Vs. uBlock Origin
GTM
Pass
Fail
Fail
On page
Pass
Fail
Fail
GTM + renamed script & function
Pass
Fail
Fail
On page + renamed script & function
Pass
Fail
Fail
Seems like those tweaked setups didn’t do much!
Lost data due to ad blockers: ~10%
Ad blocker usage can be in the 15–25% range depending on region, but many of these installs will be default setups of AdBlock Plus, which as we’ve seen above, does not block tracking. Estimates of AdBlock Plus’s market share among ad blockers vary from 50–70%, with more recent reports tending more towards the former. So, if we assume that at most 50% of installed ad blockers block analytics, that leaves your exposure at around 10%.
Reason 2: Browser “do not track”
This is another privacy motivated feature, this time of browsers themselves. You can enable it in the settings of most current browsers. It’s not compulsory for sites or platforms to obey the “do not track” request, but Firefox offers a stronger feature under the same set of options, which I decided to test as well.
Effect of “do not track”
Most browsers now offer the option to send a “Do not track” message. I tested the latest releases of Firefox & Chrome for Windows 10.
| Setup | Chrome “do not track” | Firefox “do not track” | Firefox “tracking protection” |
| --- | --- | --- | --- |
| GTM | Pass | Pass | Fail |
| On page | Pass | Pass | Fail |
| GTM + renamed script & function | Pass | Pass | Fail |
| On page + renamed script & function | Pass | Pass | Fail |
Again, it doesn’t seem that the tweaked setups are doing much work for us here.
Lost data due to “do not track”: <1%
Only Firefox Quantum’s “Tracking Protection,” introduced in February, had any effect on our trackers. Firefox has a 5% market share, but Tracking Protection is not enabled by default. The launch of this feature had no effect on the trend for Firefox traffic on Distilled.net.
Reason 3: Filters
It’s a bit of an obvious one, but filters you’ve set up in your analytics might intentionally or unintentionally reduce your reported traffic levels.
For example, a filter excluding certain niche screen resolutions that you believe to be mostly bots, or internal traffic, will obviously cause your setup to underreport slightly.
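To make that concrete, here is a toy simulation of a resolution-exclusion filter. The hit data and the blocklisted resolution are made up for illustration:

```javascript
// Toy illustration of a view filter excluding suspected-bot screen resolutions.
// Hits and the blocklist are hypothetical example data.
const hits = [
  { resolution: '1920x1080' },
  { resolution: '1366x768' },
  { resolution: '800x600' },   // niche resolution, suspected bot traffic
  { resolution: '1920x1080' },
];
const suspectedBotResolutions = new Set(['800x600']);

// The filtered view only "sees" hits that survive the exclusion.
const reported = hits.filter(h => !suspectedBotResolutions.has(h.resolution));
console.log(`Reported ${reported.length} of ${hits.length} hits`); // → "Reported 3 of 4 hits"
```

If the excluded bucket contains any real users, that shortfall is silently baked into every report built on the filtered view.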
Lost data due to filters: ???
Impact is hard to estimate, as setup will obviously vary on a site-by-site basis. I do recommend having a duplicate, unfiltered “master” view in case you realize too late you’ve lost something you didn’t intend to.
Reason 4: GTM vs. on-page vs. misplaced on-page
Google Tag Manager has become an increasingly popular way of implementing analytics in recent years, due to its increased flexibility and the ease of making changes. However, I’ve long noticed that it can tend to underreport vs. on-page setups.
I was also curious about what would happen if you didn’t follow Google’s guidelines in setting up on-page code.
By combining my numbers with numbers from my colleague Dom Woodman’s site (you’re welcome for the link, Dom), which happens to use a Drupal analytics add-on as well as GTM, I was able to see the difference between Google Tag Manager and misplaced on-page code (right at the bottom of the <body> tag). I then weighted this against my own Google Tag Manager data to get an overall picture of all five setups.
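The weighting step is just a browser-share-weighted average of the per-browser relative traffic figures. A sketch, using made-up browser shares rather than either site's actual profile data:

```javascript
// Weighted average of per-browser relative traffic (as % of baseline),
// using each browser's share of baseline traffic as the weight.
function weightedAverage(relativeTraffic, shares) {
  let total = 0;
  for (const [browser, share] of Object.entries(shares)) {
    total += share * relativeTraffic[browser];
  }
  return total;
}

// Illustrative inputs only — not the article's actual weights:
const misplacedOnPage = { chrome: 94.75, safari: 82.69, firefox: 90.68 };
const shares = { chrome: 0.7, safari: 0.2, firefox: 0.1 };
console.log(weightedAverage(misplacedOnPage, shares).toFixed(2)); // → "91.93"
```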
Effect of GTM and misplaced on-page code
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
| Browser | Google Tag Manager | Modified & Google Tag Manager | On-Page Code In <head> | Modified & On-Page Code In <head> | On-Page Code Misplaced In <body> |
| --- | --- | --- | --- | --- | --- |
| Chrome | 100.00% | 98.75% | 100.77% | 99.80% | 94.75% |
| Safari | 100.00% | 99.42% | 100.55% | 102.08% | 82.69% |
| Firefox | 100.00% | 99.71% | 101.16% | 101.45% | 90.68% |
| Internet Explorer | 100.00% | 80.06% | 112.31% | 113.37% | 77.18% |
There are a few main takeaways here:
On-page code generally reports more traffic than GTM
Modified code is generally within a margin of error, apart from modified GTM code on Internet Explorer (see note below)
Misplaced analytics code will cost you up to a third of your traffic vs. properly implemented on-page code, depending on browser (!)
The customized setups, which are designed to get more traffic by evading ad blockers, are doing nothing of the sort.
It’s worth noting also that the customized implementations actually got less traffic than the standard ones. For the on-page code, this is within the margin of error, but for Google Tag Manager, there’s another reason — because I used unfiltered profiles for the comparison, there’s a lot of bot spam in the main profile, which primarily masquerades as Internet Explorer. Our main profile is by far the most spammed, and also acting as the baseline here, so the difference between on-page code and Google Tag Manager is probably somewhat larger than what I’m reporting.
I also split the data by mobile, out of curiosity:
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
| Device | Google Tag Manager | Modified & Google Tag Manager | On-Page Code In <head> | Modified & On-Page Code In <head> | On-Page Code Misplaced In <body> |
| --- | --- | --- | --- | --- | --- |
| Desktop | 100.00% | 98.31% | 100.97% | 100.89% | 93.47% |
| Mobile | 100.00% | 97.00% | 103.78% | 100.42% | 89.87% |
| Tablet | 100.00% | 97.68% | 104.20% | 102.43% | 88.13% |
The further takeaway here seems to be that mobile browsers, like Internet Explorer, can struggle with Google Tag Manager.
Lost data due to GTM: 1–5%
Google Tag Manager seems to cost you a varying amount depending on what make-up of browsers and devices use your site. On Distilled.net, the difference is around 1.7%; however, we have an unusually desktop-heavy and tech-savvy audience (not much Internet Explorer!). Depending on vertical, this could easily swell to the 5% range.
Lost data due to misplaced on-page code: ~10%
On Teflsearch.com, the impact of misplaced on-page code was around 7.5%, vs Google Tag Manager. Keeping in mind that Google Tag Manager itself underreports, the total loss could easily be in the 10% range.
Bonus round: Missing data from channels
I’ve focused above on areas where you might be missing data altogether. However, there are also lots of ways in which data can be misrepresented, or detail can be missing. I’ll cover these more briefly, but the main issues are dark traffic and attribution.
Dark traffic
Dark traffic is direct traffic that didn’t really come via direct — which is generally becoming more and more common. Typical causes are:
Untagged campaigns in email
Untagged campaigns in apps (especially Facebook, Twitter, etc.)
Misrepresented organic
Data sent from botched tracking implementations (which can also appear as self-referrals)
It’s also worth noting the trend towards genuinely direct traffic that would historically have been organic. For example, due to increasingly sophisticated browser autocompletes, cross-device history, and so on, people end up “typing” a URL that they’d have searched for historically.
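Much of the dark traffic from email and apps is avoidable simply by tagging outbound links consistently. A minimal helper using Google Analytics' standard `utm_*` campaign parameters (the example URL and values are hypothetical):

```javascript
// Builds a campaign-tagged URL so email/app clicks don't land as "direct".
// utm_source / utm_medium / utm_campaign are GA's standard campaign tags.
function tagCampaignUrl(baseUrl, { source, medium, campaign }) {
  const url = new URL(baseUrl);
  url.searchParams.set('utm_source', source);
  url.searchParams.set('utm_medium', medium);
  url.searchParams.set('utm_campaign', campaign);
  return url.toString();
}

console.log(tagCampaignUrl('https://www.example.com/offer', {
  source: 'newsletter', medium: 'email', campaign: 'spring_sale',
}));
// → https://www.example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```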
Attribution
I’ve written about this in more detail here, but in general, a session in Google Analytics (and any other platform) is a fairly arbitrary construct — you might think it’s obvious how a group of hits should be grouped into one or more sessions, but in fact, the process relies on a number of fairly questionable assumptions. In particular, it’s worth noting that Google Analytics generally attributes direct traffic (including dark traffic) to the previous non-direct source, if one exists.
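A minimal sketch of that "last non-direct" rule, simplified for illustration (real GA also applies campaign timeouts and other session logic):

```javascript
// Simplified "last non-direct click" attribution: a direct session inherits
// the most recent non-direct source, if one exists. Real GA adds campaign
// timeouts and more nuanced session rules on top of this.
function attribute(sessions) {
  let lastNonDirect = null;
  return sessions.map(source => {
    if (source !== 'direct') lastNonDirect = source;
    return source === 'direct' ? (lastNonDirect || 'direct') : source;
  });
}

console.log(attribute(['organic', 'direct', 'email', 'direct', 'direct']));
// → ['organic', 'organic', 'email', 'email', 'email']
```

Note how the direct visits vanish from the report entirely, which is exactly why "direct" in GA understates how often people actually arrive directly.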
Discussion
I was quite surprised by some of my own findings when researching this post, but I’m sure I didn’t get everything. Can you think of any other ways in which data can end up missing from analytics?
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from Blogger https://ift.tt/2KZaOKK via IFTTT
0 notes
nereomata · 6 years
Text
How Much Data Is Missing from Analytics? And Other Analytics Black Holes
Posted by Tom.Capper
If you’ve ever compared two analytics implementations on the same site, or compared your analytics with what your business is reporting in sales, you’ve probably noticed that things don’t always match up. In this post, I’ll explain why data is missing from your web analytics platforms and how large the impact could be. Some of the issues I cover are actually quite easily addressed, and have a decent impact on traffic — there’s never been an easier way to hit your quarterly targets. ;)
I’m going to focus on GA (Google Analytics), as it's the most commonly used provider, but most on-page analytics platforms have the same issues. Platforms that rely on server logs do avoid some issues but are fairly rare, so I won’t cover them in any depth.
Side note: Our test setup (multiple trackers & customized GA)
On Distilled.net, we have a standard Google Analytics property running from an HTML tag in GTM (Google Tag Manager). In addition, for the last two years, I’ve been running three extra concurrent Google Analytics implementations, designed to measure discrepancies between different configurations.
(If you’re just interested in my findings, you can skip this section, but if you want to hear more about the methodology, continue reading. Similarly, don’t worry if you don’t understand some of the detail here — the results are easier to follow.)
Two of these extra implementations — one in Google Tag Manager and one on page — run locally hosted, renamed copies of the Google Analytics JavaScript file (e.g. www.distilled.net/static/js/au3.js, instead of www.google-analytics.com/analytics.js) to make them harder to spot for ad blockers. I also used renamed JavaScript functions (“tcap” and “Buffoon,” rather than the standard “ga”) and renamed trackers (“FredTheUnblockable” and “AlbertTheImmutable”) to avoid having duplicate trackers (which can often cause issues).
This was originally inspired by 2016-era best practice on how to get your Google Analytics setup past ad blockers. I can’t find the original article now, but you can see a very similar one from 2017 here.
Lastly, we have (“DianaTheIndefatigable”), which just has a renamed tracker, but uses the standard code otherwise and is implemented on-page. This is to complete the set of all combinations of modified and unmodified GTM and on-page trackers.
Two of Distilled’s modified on-page trackers, as seen on https://www.distilled.net/
Overall, this table summarizes our setups:
Tracker
Renamed function?
GTM or on-page?
Locally hosted JavaScript file?
Default
No
GTM HTML tag
No
FredTheUnblockable
Yes - “tcap”
GTM HTML tag
Yes
AlbertTheImmutable
Yes - “buffoon”
On page
Yes
DianaTheIndefatigable
No
On page
No
I tested their functionality in various browser/ad-block environments by watching for the pageviews appearing in browser developer tools:
Reason 1: Ad Blockers
Ad blockers, primarily as browser extensions, have been growing in popularity for some time now. Primarily this has been to do with users looking for better performance and UX on ad-laden sites, but in recent years an increased emphasis on privacy has also crept in, hence the possibility of analytics blocking.
Effect of ad blockers
Some ad blockers block web analytics platforms by default, others can be configured to do so. I tested Distilled’s site with Adblock Plus and uBlock Origin, two of the most popular ad-blocking desktop browser addons, but it’s worth noting that ad blockers are increasingly prevalent on smartphones, too.
Here’s how Distilled’s setups fared:
(All numbers shown are from April 2018)
Setup
Vs. Adblock
Vs. Adblock with “EasyPrivacy” enabled
Vs. uBlock Origin
GTM
Pass
Fail
Fail
On page
Pass
Fail
Fail
GTM + renamed script & function
Pass
Fail
Fail
On page + renamed script & function
Pass
Fail
Fail
Seems like those tweaked setups didn’t do much!
Lost data due to ad blockers: ~10%
Ad blocker usage can be in the 15–25% range depending on region, but many of these installs will be default setups of AdBlock Plus, which as we’ve seen above, does not block tracking. Estimates of AdBlock Plus’s market share among ad blockers vary from 50–70%, with more recent reports tending more towards the former. So, if we assume that at most 50% of installed ad blockers block analytics, that leaves your exposure at around 10%.
Reason 2: Browser “do not track”
This is another privacy motivated feature, this time of browsers themselves. You can enable it in the settings of most current browsers. It’s not compulsory for sites or platforms to obey the “do not track” request, but Firefox offers a stronger feature under the same set of options, which I decided to test as well.
Effect of “do not track”
Most browsers now offer the option to send a “Do not track” message. I tested the latest releases of Firefox & Chrome for Windows 10.
Setup
Chrome “do not track”
Firefox “do not track”
Firefox “tracking protection”
GTM
Pass
Pass
Fail
On page
Pass
Pass
Fail
GTM + renamed script & function
Pass
Pass
Fail
On page + renamed script & function
Pass
Pass
Fail
Again, it doesn’t seem that the tweaked setups are doing much work for us here.
Lost data due to “do not track”: <1%
Only Firefox Quantum’s “Tracking Protection,” introduced in February, had any effect on our trackers. Firefox has a 5% market share, but Tracking Protection is not enabled by default. The launch of this feature had no effect on the trend for Firefox traffic on Distilled.net.
Reason 3: Filters
It’s a bit of an obvious one, but filters you’ve set up in your analytics might intentionally or unintentionally reduce your reported traffic levels.
For example, a filter excluding certain niche screen resolutions that you believe to be mostly bots, or internal traffic, will obviously cause your setup to underreport slightly.
Lost data due to filters: ???
Impact is hard to estimate, as setup will obviously vary on a site-by site-basis. I do recommend having a duplicate, unfiltered “master” view in case you realize too late you’ve lost something you didn’t intend to.
Reason 4: GTM vs. on-page vs. misplaced on-page
Google Tag Manager has become an increasingly popular way of implementing analytics in recent years, due to its increased flexibility and the ease of making changes. However, I’ve long noticed that it can tend to underreport vs. on-page setups.
I was also curious about what would happen if you didn’t follow Google’s guidelines in setting up on-page code.
By combining my numbers with numbers from my colleague Dom Woodman’s site (you’re welcome for the link, Dom), which happens to use a Drupal analytics add-on as well as GTM, I was able to see the difference between Google Tag Manager and misplaced on-page code (right at the bottom of the <body> tag) I then weighted this against my own Google Tag Manager data to get an overall picture of all 5 setups.
Effect of GTM and misplaced on-page code
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
Google Tag Manager
Modified & Google Tag Manager
On-Page Code In <head>
Modified & On-Page Code In <head>
On-Page Code Misplaced In <Body>
Chrome
100.00%
98.75%
100.77%
99.80%
94.75%
Safari
100.00%
99.42%
100.55%
102.08%
82.69%
Firefox
100.00%
99.71%
101.16%
101.45%
90.68%
Internet Explorer
100.00%
80.06%
112.31%
113.37%
77.18%
There are a few main takeaways here:
On-page code generally reports more traffic than GTM
Modified code is generally within a margin of error, apart from modified GTM code on Internet Explorer (see note below)
Misplaced analytics code will cost you up to a third of your traffic vs. properly implemented on-page code, depending on browser (!)
The customized setups, which are designed to get more traffic by evading ad blockers, are doing nothing of the sort.
It’s worth noting also that the customized implementations actually got less traffic than the standard ones. For the on-page code, this is within the margin of error, but for Google Tag Manager, there’s another reason — because I used unfiltered profiles for the comparison, there’s a lot of bot spam in the main profile, which primarily masquerades as Internet Explorer. Our main profile is by far the most spammed, and also acting as the baseline here, so the difference between on-page code and Google Tag Manager is probably somewhat larger than what I’m reporting.
I also split the data by mobile, out of curiosity:
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
Google Tag Manager
Modified & Google Tag Manager
On-Page Code In <head>
Modified & On-Page Code In <head>
On-Page Code Misplaced In <Body>
Desktop
100.00%
98.31%
100.97%
100.89%
93.47%
Mobile
100.00%
97.00%
103.78%
100.42%
89.87%
Tablet
100.00%
97.68%
104.20%
102.43%
88.13%
The further takeaway here seems to be that mobile browsers, like Internet Explorer, can struggle with Google Tag Manager.
Lost data due to GTM: 1–5%
Google Tag Manager seems to cost you a varying amount depending on what make-up of browsers and devices use your site. On Distilled.net, the difference is around 1.7%; however, we have an unusually desktop-heavy and tech-savvy audience (not much Internet Explorer!). Depending on vertical, this could easily swell to the 5% range.
Lost data due to misplaced on-page code: ~10%
On Teflsearch.com, the impact of misplaced on-page code was around 7.5%, vs Google Tag Manager. Keeping in mind that Google Tag Manager itself underreports, the total loss could easily be in the 10% range.
Bonus round: Missing data from channels
I’ve focused above on areas where you might be missing data altogether. However, there are also lots of ways in which data can be misrepresented, or detail can be missing. I’ll cover these more briefly, but the main issues are dark traffic and attribution.
Dark traffic
Dark traffic is direct traffic that didn’t really come via direct — which is generally becoming more and more common. Typical causes are:
Untagged campaigns in email
Untagged campaigns in apps (especially Facebook, Twitter, etc.)
Misrepresented organic
Data sent from botched tracking implementations (which can also appear as self-referrals)
It’s also worth noting the trend towards genuinely direct traffic that would historically have been organic. For example, due to increasingly sophisticated browser autocompletes, cross-device history, and so on, people end up “typing” a URL that they’d have searched for historically.
Attribution
I’ve written about this in more detail here, but in general, a session in Google Analytics (and any other platform) is a fairly arbitrary construct — you might think it’s obvious how a group of hits should be grouped into one or more sessions, but in fact, the process relies on a number of fairly questionable assumptions. In particular, it’s worth noting that Google Analytics generally attributes direct traffic (including dark traffic) to the previous non-direct source, if one exists.
Discussion
I was quite surprised by some of my own findings when researching this post, but I’m sure I didn’t get everything. Can you think of any other ways in which data can end up missing from analytics?
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from Moz Blog https://moz.com/blog/analytics-black-holes via IFTTT from IM Local SEO Blog http://imlocalseo.blogspot.com/2018/05/how-much-data-is-missing-from-analytics.html via IFTTT from Blogger http://nereomata.blogspot.com/2018/05/how-much-data-is-missing-from-analytics.html via IFTTT
0 notes
swunlimitednj · 6 years
Text
How Much Data Is Missing from Analytics? And Other Analytics Black Holes
Posted by Tom.Capper
If you’ve ever compared two analytics implementations on the same site, or compared your analytics with what your business is reporting in sales, you’ve probably noticed that things don’t always match up. In this post, I’ll explain why data is missing from your web analytics platforms and how large the impact could be. Some of the issues I cover are actually quite easily addressed, and have a decent impact on traffic — there’s never been an easier way to hit your quarterly targets. ;)
I’m going to focus on GA (Google Analytics), as it's the most commonly used provider, but most on-page analytics platforms have the same issues. Platforms that rely on server logs do avoid some issues but are fairly rare, so I won’t cover them in any depth.
Side note: Our test setup (multiple trackers & customized GA)
On Distilled.net, we have a standard Google Analytics property running from an HTML tag in GTM (Google Tag Manager). In addition, for the last two years, I’ve been running three extra concurrent Google Analytics implementations, designed to measure discrepancies between different configurations.
(If you’re just interested in my findings, you can skip this section, but if you want to hear more about the methodology, continue reading. Similarly, don’t worry if you don’t understand some of the detail here — the results are easier to follow.)
Two of these extra implementations — one in Google Tag Manager and one on page — run locally hosted, renamed copies of the Google Analytics JavaScript file (e.g. www.distilled.net/static/js/au3.js, instead of www.google-analytics.com/analytics.js) to make them harder to spot for ad blockers. I also used renamed JavaScript functions (“tcap” and “Buffoon,” rather than the standard “ga”) and renamed trackers (“FredTheUnblockable” and “AlbertTheImmutable”) to avoid having duplicate trackers (which can often cause issues).
This was originally inspired by 2016-era best practice on how to get your Google Analytics setup past ad blockers. I can’t find the original article now, but you can see a very similar one from 2017 here.
Lastly, we have (“DianaTheIndefatigable”), which just has a renamed tracker, but uses the standard code otherwise and is implemented on-page. This is to complete the set of all combinations of modified and unmodified GTM and on-page trackers.
Two of Distilled’s modified on-page trackers, as seen on https://www.distilled.net/
Overall, this table summarizes our setups:
Tracker
Renamed function?
GTM or on-page?
Locally hosted JavaScript file?
Default
No
GTM HTML tag
No
FredTheUnblockable
Yes - “tcap”
GTM HTML tag
Yes
AlbertTheImmutable
Yes - “buffoon”
On page
Yes
DianaTheIndefatigable
No
On page
No
I tested their functionality in various browser/ad-block environments by watching for the pageviews appearing in browser developer tools:
Reason 1: Ad Blockers
Ad blockers, primarily as browser extensions, have been growing in popularity for some time now. Primarily this has been to do with users looking for better performance and UX on ad-laden sites, but in recent years an increased emphasis on privacy has also crept in, hence the possibility of analytics blocking.
Effect of ad blockers
Some ad blockers block web analytics platforms by default, others can be configured to do so. I tested Distilled’s site with Adblock Plus and uBlock Origin, two of the most popular ad-blocking desktop browser addons, but it’s worth noting that ad blockers are increasingly prevalent on smartphones, too.
Here’s how Distilled’s setups fared:
(All numbers shown are from April 2018)
Setup
Vs. Adblock
Vs. Adblock with “EasyPrivacy” enabled
Vs. uBlock Origin
GTM
Pass
Fail
Fail
On page
Pass
Fail
Fail
GTM + renamed script & function
Pass
Fail
Fail
On page + renamed script & function
Pass
Fail
Fail
Seems like those tweaked setups didn’t do much!
Lost data due to ad blockers: ~10%
Ad blocker usage can be in the 15–25% range depending on region, but many of these installs will be default setups of AdBlock Plus, which as we’ve seen above, does not block tracking. Estimates of AdBlock Plus’s market share among ad blockers vary from 50–70%, with more recent reports tending more towards the former. So, if we assume that at most 50% of installed ad blockers block analytics, that leaves your exposure at around 10%.
Reason 2: Browser “do not track”
This is another privacy motivated feature, this time of browsers themselves. You can enable it in the settings of most current browsers. It’s not compulsory for sites or platforms to obey the “do not track” request, but Firefox offers a stronger feature under the same set of options, which I decided to test as well.
Effect of “do not track”
Most browsers now offer the option to send a “Do not track” message. I tested the latest releases of Firefox & Chrome for Windows 10.
Setup
Chrome “do not track”
Firefox “do not track”
Firefox “tracking protection”
GTM
Pass
Pass
Fail
On page
Pass
Pass
Fail
GTM + renamed script & function
Pass
Pass
Fail
On page + renamed script & function
Pass
Pass
Fail
Again, it doesn’t seem that the tweaked setups are doing much work for us here.
Lost data due to “do not track”: <1%
Only Firefox Quantum’s “Tracking Protection,” introduced in February, had any effect on our trackers. Firefox has a 5% market share, but Tracking Protection is not enabled by default. The launch of this feature had no effect on the trend for Firefox traffic on Distilled.net.
Reason 3: Filters
It’s a bit of an obvious one, but filters you’ve set up in your analytics might intentionally or unintentionally reduce your reported traffic levels.
For example, a filter excluding certain niche screen resolutions that you believe to be mostly bots, or internal traffic, will obviously cause your setup to underreport slightly.
Lost data due to filters: ???
Impact is hard to estimate, as setup will obviously vary on a site-by site-basis. I do recommend having a duplicate, unfiltered “master” view in case you realize too late you’ve lost something you didn’t intend to.
Reason 4: GTM vs. on-page vs. misplaced on-page
Google Tag Manager has become an increasingly popular way of implementing analytics in recent years, due to its increased flexibility and the ease of making changes. However, I’ve long noticed that it can tend to underreport vs. on-page setups.
I was also curious about what would happen if you didn’t follow Google’s guidelines in setting up on-page code.
By combining my numbers with numbers from my colleague Dom Woodman’s site (you’re welcome for the link, Dom), which happens to use a Drupal analytics add-on as well as GTM, I was able to see the difference between Google Tag Manager and misplaced on-page code (right at the bottom of the <body> tag) I then weighted this against my own Google Tag Manager data to get an overall picture of all 5 setups.
Effect of GTM and misplaced on-page code
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
Google Tag Manager
Modified & Google Tag Manager
On-Page Code In <head>
Modified & On-Page Code In <head>
On-Page Code Misplaced In <Body>
Chrome
100.00%
98.75%
100.77%
99.80%
94.75%
Safari
100.00%
99.42%
100.55%
102.08%
82.69%
Firefox
100.00%
99.71%
101.16%
101.45%
90.68%
Internet Explorer
100.00%
80.06%
112.31%
113.37%
77.18%
There are a few main takeaways here:
On-page code generally reports more traffic than GTM
Modified code is generally within a margin of error, apart from modified GTM code on Internet Explorer (see note below)
Misplaced analytics code will cost you up to a third of your traffic vs. properly implemented on-page code, depending on browser (!)
The customized setups, which are designed to get more traffic by evading ad blockers, are doing nothing of the sort.
It’s worth noting also that the customized implementations actually got less traffic than the standard ones. For the on-page code, this is within the margin of error, but for Google Tag Manager, there’s another reason — because I used unfiltered profiles for the comparison, there’s a lot of bot spam in the main profile, which primarily masquerades as Internet Explorer. Our main profile is by far the most spammed, and also acting as the baseline here, so the difference between on-page code and Google Tag Manager is probably somewhat larger than what I’m reporting.
I also split the data by mobile, out of curiosity:
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
Google Tag Manager
Modified & Google Tag Manager
On-Page Code In <head>
Modified & On-Page Code In <head>
On-Page Code Misplaced In <Body>
Desktop
100.00%
98.31%
100.97%
100.89%
93.47%
Mobile
100.00%
97.00%
103.78%
100.42%
89.87%
Tablet
100.00%
97.68%
104.20%
102.43%
88.13%
The further takeaway here seems to be that mobile browsers, like Internet Explorer, can struggle with Google Tag Manager.
Lost data due to GTM: 1–5%
Google Tag Manager seems to cost you a varying amount depending on what make-up of browsers and devices use your site. On Distilled.net, the difference is around 1.7%; however, we have an unusually desktop-heavy and tech-savvy audience (not much Internet Explorer!). Depending on vertical, this could easily swell to the 5% range.
Lost data due to misplaced on-page code: ~10%
On Teflsearch.com, the impact of misplaced on-page code was around 7.5%, vs Google Tag Manager. Keeping in mind that Google Tag Manager itself underreports, the total loss could easily be in the 10% range.
Bonus round: Missing data from channels
I’ve focused above on areas where you might be missing data altogether. However, there are also lots of ways in which data can be misrepresented, or detail can be missing. I’ll cover these more briefly, but the main issues are dark traffic and attribution.
Dark traffic
Dark traffic is direct traffic that didn’t really come via direct — which is generally becoming more and more common. Typical causes are:
Untagged campaigns in email
Untagged campaigns in apps (especially Facebook, Twitter, etc.)
Misrepresented organic
Data sent from botched tracking implementations (which can also appear as self-referrals)
It’s also worth noting the trend towards genuinely direct traffic that would historically have been organic. For example, due to increasingly sophisticated browser autocompletes, cross-device history, and so on, people end up “typing” a URL that they’d have searched for historically.
Attribution
I’ve written about this in more detail here, but in general, a session in Google Analytics (and any other platform) is a fairly arbitrary construct — you might think it’s obvious how a group of hits should be grouped into one or more sessions, but in fact, the process relies on a number of fairly questionable assumptions. In particular, it’s worth noting that Google Analytics generally attributes direct traffic (including dark traffic) to the previous non-direct source, if one exists.
Discussion
I was quite surprised by some of my own findings when researching this post, but I’m sure I didn’t get everything. Can you think of any other ways in which data can end up missing from analytics?
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from Blogger https://ift.tt/2J9fNey via SW Unlimited
0 notes
rodneyevesuarywk · 6 years
Text
How Much Data Is Missing from Analytics? And Other Analytics Black Holes
Posted by Tom.Capper
If you’ve ever compared two analytics implementations on the same site, or compared your analytics with what your business is reporting in sales, you’ve probably noticed that things don’t always match up. In this post, I’ll explain why data is missing from your web analytics platforms and how large the impact could be. Some of the issues I cover are actually quite easily addressed, and have a decent impact on traffic — there’s never been an easier way to hit your quarterly targets. ;)
I’m going to focus on GA (Google Analytics), as it's the most commonly used provider, but most on-page analytics platforms have the same issues. Platforms that rely on server logs do avoid some issues but are fairly rare, so I won’t cover them in any depth.
Side note: Our test setup (multiple trackers & customized GA)
On Distilled.net, we have a standard Google Analytics property running from an HTML tag in GTM (Google Tag Manager). In addition, for the last two years, I’ve been running three extra concurrent Google Analytics implementations, designed to measure discrepancies between different configurations.
(If you’re just interested in my findings, you can skip this section, but if you want to hear more about the methodology, continue reading. Similarly, don’t worry if you don’t understand some of the detail here — the results are easier to follow.)
Two of these extra implementations — one in Google Tag Manager and one on page — run locally hosted, renamed copies of the Google Analytics JavaScript file (e.g. www.distilled.net/static/js/au3.js, instead of www.google-analytics.com/analytics.js) to make them harder to spot for ad blockers. I also used renamed JavaScript functions (“tcap” and “Buffoon,” rather than the standard “ga”) and renamed trackers (“FredTheUnblockable” and “AlbertTheImmutable”) to avoid having duplicate trackers (which can often cause issues).
This was originally inspired by 2016-era best practice on how to get your Google Analytics setup past ad blockers. I can’t find the original article now, but you can see a very similar one from 2017 here.
Lastly, we have (“DianaTheIndefatigable”), which just has a renamed tracker, but uses the standard code otherwise and is implemented on-page. This is to complete the set of all combinations of modified and unmodified GTM and on-page trackers.
Two of Distilled’s modified on-page trackers, as seen on https://www.distilled.net/
Overall, this table summarizes our setups:
| Tracker | Renamed function? | GTM or on-page? | Locally hosted JavaScript file? |
|---|---|---|---|
| Default | No | GTM HTML tag | No |
| FredTheUnblockable | Yes - “tcap” | GTM HTML tag | Yes |
| AlbertTheImmutable | Yes - “buffoon” | On page | Yes |
| DianaTheIndefatigable | No | On page | No |
I tested their functionality in various browser/ad-block environments by watching for the pageviews appearing in browser developer tools.
Reason 1: Ad Blockers
Ad blockers, primarily in the form of browser extensions, have been growing in popularity for some time now. This has mostly been driven by users looking for better performance and UX on ad-laden sites, but in recent years an increased emphasis on privacy has also crept in, hence the possibility of analytics blocking.
Effect of ad blockers
Some ad blockers block web analytics platforms by default; others can be configured to do so. I tested Distilled’s site with Adblock Plus and uBlock Origin, two of the most popular ad-blocking desktop browser addons, but it’s worth noting that ad blockers are increasingly prevalent on smartphones, too.
Here’s how Distilled’s setups fared:
(All numbers shown are from April 2018)
| Setup | Vs. Adblock | Vs. Adblock with “EasyPrivacy” enabled | Vs. uBlock Origin |
|---|---|---|---|
| GTM | Pass | Fail | Fail |
| On page | Pass | Fail | Fail |
| GTM + renamed script & function | Pass | Fail | Fail |
| On page + renamed script & function | Pass | Fail | Fail |
Seems like those tweaked setups didn’t do much!
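A plausible explanation: filter lists like EasyPrivacy match on request URLs, and they cover the measurement endpoint as well as the tracking script itself. Renaming and self-hosting the JavaScript evades the first kind of rule, but the hits it sends still go to a blocked URL. A toy illustration, where the blocklist entries are simplified stand-ins for real filter rules:

```javascript
// Simplified stand-ins for URL-matching rules in an EasyPrivacy-style list.
const blocklist = [
  'google-analytics.com/analytics.js', // the tracking script
  'google-analytics.com/collect',      // the measurement endpoint
];

const isBlocked = (url) => blocklist.some((pattern) => url.includes(pattern));

// The renamed, locally hosted script evades the script rule...
console.log(isBlocked('https://www.distilled.net/static/js/au3.js')); // false

// ...but the pageview hit it sends is still caught by the endpoint rule.
console.log(isBlocked('https://www.google-analytics.com/collect?v=1&t=pageview')); // true
```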
Lost data due to ad blockers: ~10%
Ad blocker usage can be in the 15–25% range depending on region, but many of these installs will be default setups of AdBlock Plus, which as we’ve seen above, does not block tracking. Estimates of AdBlock Plus’s market share among ad blockers vary from 50–70%, with more recent reports tending more towards the former. So, if we assume that at most 50% of installed ad blockers block analytics, that leaves your exposure at around 10%.
Reason 2: Browser “do not track”
This is another privacy-motivated feature, this time of browsers themselves. You can enable it in the settings of most current browsers. It’s not compulsory for sites or platforms to obey the “do not track” request, but Firefox offers a stronger feature under the same set of options, which I decided to test as well.
Effect of “do not track”
Most browsers now offer the option to send a “Do not track” message. I tested the latest releases of Firefox & Chrome for Windows 10.
| Setup | Chrome “do not track” | Firefox “do not track” | Firefox “tracking protection” |
|---|---|---|---|
| GTM | Pass | Pass | Fail |
| On page | Pass | Pass | Fail |
| GTM + renamed script & function | Pass | Pass | Fail |
| On page + renamed script & function | Pass | Pass | Fail |
Again, it doesn’t seem that the tweaked setups are doing much work for us here.
Lost data due to “do not track”: <1%
Only Firefox Quantum’s “Tracking Protection,” introduced in February, had any effect on our trackers. Firefox has a 5% market share, but Tracking Protection is not enabled by default. The launch of this feature had no effect on the trend for Firefox traffic on Distilled.net.
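Since “do not track” is voluntary, honoring it is up to the site: the browser just exposes a flag that tracking code can choose to check, and nothing enforces it — which is why DNT alone blocked nothing here. A hedged sketch, with a stubbed object standing in for the browser’s real `navigator` global:

```javascript
// Stub standing in for the browser's navigator global.
const navigator = { doNotTrack: '1' };

// A site that wanted to honor DNT voluntarily could gate its tracking like
// this; nothing in the browser enforces the preference on the site's behalf.
function shouldTrack(nav) {
  return nav.doNotTrack !== '1';
}

console.log(shouldTrack(navigator));            // false: visitor opted out
console.log(shouldTrack({ doNotTrack: null })); // true: no preference set
```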
Reason 3: Filters
It’s a bit of an obvious one, but filters you’ve set up in your analytics might intentionally or unintentionally reduce your reported traffic levels.
For example, a filter excluding certain niche screen resolutions that you believe to be mostly bots, or internal traffic, will obviously cause your setup to underreport slightly.
Lost data due to filters: ???
Impact is hard to estimate, as setups will obviously vary on a site-by-site basis. I do recommend keeping a duplicate, unfiltered “master” view in case you realize too late that you’ve lost something you didn’t intend to.
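As a toy model of how a filter quietly shaves off traffic, consider an exclude filter applied to a stream of hits. The IPs and resolutions below are invented, but the mechanism matches a GA exclude filter on internal IPs or bot-like screen resolutions:

```javascript
// A toy hit stream; the annotations mark what a site owner might filter out.
const hits = [
  { ip: '203.0.113.7',  resolution: '1920x1080' },
  { ip: '10.0.0.5',     resolution: '1366x768'  }, // internal office IP
  { ip: '198.51.100.2', resolution: '800x600'   }, // resolution flagged as bot-like
];

const internalIps = ['10.0.0.5'];
const botResolutions = ['800x600'];

// An exclude filter keeps only hits matching neither condition.
const reported = hits.filter(
  (h) => !internalIps.includes(h.ip) && !botResolutions.includes(h.resolution)
);

console.log(reported.length); // 1 of 3 hits survives — the filtered view underreports
```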
Reason 4: GTM vs. on-page vs. misplaced on-page
Google Tag Manager has become an increasingly popular way of implementing analytics in recent years, due to its increased flexibility and the ease of making changes. However, I’ve long noticed that it tends to underreport vs. on-page setups.
I was also curious about what would happen if you didn’t follow Google’s guidelines in setting up on-page code.
By combining my numbers with numbers from my colleague Dom Woodman’s site (you’re welcome for the link, Dom), which happens to use a Drupal analytics add-on as well as GTM, I was able to see the difference between Google Tag Manager and misplaced on-page code (right at the bottom of the <body> tag). I then weighted this against my own Google Tag Manager data to get an overall picture of all 5 setups.
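The chaining works because both comparisons share a GTM baseline: one site gives misplaced-code-vs-GTM, the other gives correct-on-page-vs-GTM, and dividing the two ratios yields misplaced code vs. properly placed code. With invented numbers:

```javascript
// Hypothetical ratios, each measured against the shared GTM baseline.
const misplacedVsGtm = 0.925; // misplaced <body> code / GTM (one site)
const headVsGtm = 1.01;       // correct <head> code / GTM (the other site)

// Dividing through the common baseline gives misplaced vs. properly
// placed on-page code, which neither site measures directly.
const misplacedVsHead = misplacedVsGtm / headVsGtm;
console.log(misplacedVsHead.toFixed(3)); // "0.916"
```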
Effect of GTM and misplaced on-page code
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
| Browser | Google Tag Manager | Modified & Google Tag Manager | On-Page Code In <head> | Modified & On-Page Code In <head> | On-Page Code Misplaced In <body> |
|---|---|---|---|---|---|
| Chrome | 100.00% | 98.75% | 100.77% | 99.80% | 94.75% |
| Safari | 100.00% | 99.42% | 100.55% | 102.08% | 82.69% |
| Firefox | 100.00% | 99.71% | 101.16% | 101.45% | 90.68% |
| Internet Explorer | 100.00% | 80.06% | 112.31% | 113.37% | 77.18% |
There are a few main takeaways here:
On-page code generally reports more traffic than GTM
Modified code is generally within a margin of error, apart from modified GTM code on Internet Explorer (see note below)
Misplaced analytics code will cost you up to a third of your traffic vs. properly implemented on-page code, depending on browser (!)
The customized setups, which are designed to get more traffic by evading ad blockers, are doing nothing of the sort.
It’s worth noting also that the customized implementations actually got less traffic than the standard ones. For the on-page code, this is within the margin of error, but for Google Tag Manager, there’s another reason — because I used unfiltered profiles for the comparison, there’s a lot of bot spam in the main profile, which primarily masquerades as Internet Explorer. Our main profile is by far the most spammed, and also acting as the baseline here, so the difference between on-page code and Google Tag Manager is probably somewhat larger than what I’m reporting.
I also split the data by mobile, out of curiosity:
Traffic as a percentage of baseline (standard Google Tag Manager implementation):
| Device | Google Tag Manager | Modified & Google Tag Manager | On-Page Code In <head> | Modified & On-Page Code In <head> | On-Page Code Misplaced In <body> |
|---|---|---|---|---|---|
| Desktop | 100.00% | 98.31% | 100.97% | 100.89% | 93.47% |
| Mobile | 100.00% | 97.00% | 103.78% | 100.42% | 89.87% |
| Tablet | 100.00% | 97.68% | 104.20% | 102.43% | 88.13% |
The further takeaway here seems to be that mobile browsers, like Internet Explorer, can struggle with Google Tag Manager.
Lost data due to GTM: 1–5%
Google Tag Manager seems to cost you a varying amount depending on what make-up of browsers and devices use your site. On Distilled.net, the difference is around 1.7%; however, we have an unusually desktop-heavy and tech-savvy audience (not much Internet Explorer!). Depending on vertical, this could easily swell to the 5% range.
Lost data due to misplaced on-page code: ~10%
On Teflsearch.com, the impact of misplaced on-page code was around 7.5% vs. Google Tag Manager. Keeping in mind that Google Tag Manager itself underreports, the total loss could easily be in the 10% range.
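For a rough combined figure, these losses compound multiplicatively, since each mechanism removes a share of whatever traffic the previous one left behind. Taking illustrative midpoints from the estimates above (the specific values are my own assumptions, not measurements):

```javascript
// Illustrative midpoints from the loss estimates in this post.
const adBlockers = 0.10;  // ~10%
const doNotTrack = 0.005; // <1%
const gtmLoss = 0.02;     // within the 1–5% range

// Each mechanism drops a share of what the previous ones left behind,
// so the surviving fraction is a product, not a sum of losses.
const remaining = (1 - adBlockers) * (1 - doNotTrack) * (1 - gtmLoss);
const totalLost = 1 - remaining;
console.log(totalLost.toFixed(3)); // "0.122" — roughly 12% of traffic unmeasured
```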
Bonus round: Missing data from channels
I’ve focused above on areas where you might be missing data altogether. However, there are also lots of ways in which data can be misrepresented, or detail can be missing. I’ll cover these more briefly, but the main issues are dark traffic and attribution.
Dark traffic
Dark traffic is direct traffic that didn’t really come via direct — which is generally becoming more and more common. Typical causes are:
Untagged campaigns in email
Untagged campaigns in apps (especially Facebook, Twitter, etc.)
Misrepresented organic
Data sent from botched tracking implementations (which can also appear as self-referrals)
It’s also worth noting the trend towards genuinely direct traffic that would historically have been organic. For example, due to increasingly sophisticated browser autocompletes, cross-device history, and so on, people end up “typing” a URL that they’d have searched for historically.
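For the first two causes, the fix is straightforward: tag campaign links so the visit carries its source with it. A sketch using the standard UTM parameters (the URL and values are made up):

```javascript
// Tagging an email campaign link so it doesn't land in direct/dark traffic.
const url = new URL('https://www.example.com/landing-page'); // hypothetical page
url.searchParams.set('utm_source', 'newsletter');
url.searchParams.set('utm_medium', 'email');
url.searchParams.set('utm_campaign', 'june-roundup');

console.log(url.toString());
// https://www.example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=june-roundup
```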
Attribution
I’ve written about this in more detail here, but in general, a session in Google Analytics (and any other platform) is a fairly arbitrary construct — you might think it’s obvious how a group of hits should be grouped into one or more sessions, but in fact, the process relies on a number of fairly questionable assumptions. In particular, it’s worth noting that Google Analytics generally attributes direct traffic (including dark traffic) to the previous non-direct source, if one exists.
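That last non-direct-click behavior can be modeled in a few lines: each direct session inherits the most recent non-direct source, if one exists. This is a deliberate simplification that ignores lookback windows and campaign timeouts:

```javascript
// Toy model of last non-direct-click attribution: a direct session is
// credited to the most recent non-direct source seen so far, if any.
function attribute(sessions) {
  let lastNonDirect = null;
  return sessions.map((source) => {
    if (source !== 'direct') lastNonDirect = source;
    return lastNonDirect || 'direct';
  });
}

console.log(attribute(['organic', 'direct', 'email', 'direct', 'direct']));
// [ 'organic', 'organic', 'email', 'email', 'email' ]
```

Note how the three direct sessions vanish from reporting entirely, absorbed into organic and email.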
Discussion
I was quite surprised by some of my own findings when researching this post, but I’m sure I didn’t get everything. Can you think of any other ways in which data can end up missing from analytics?
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!