via Screaming Frog
Ten years ago we released the Screaming Frog SEO Spider without any real expectation. It was built for fun and with love, mostly in evenings and at weekends, originally just for my own use. It solved problems I was having, and then we thought it might help others too.
It turns out, it did. Despite it looking like this.
The simple data-first UI has been completely revolutionised, by adding MOAR tabs.
Fast-forward to 2020 and we’ve grown to an amazing team of around 40 (across agency and software), with hundreds of thousands of users worldwide. We’ve been first to market with features like JavaScript rendering and structured data validation, and we built an SEO Log File Analyser (for the 3 SEOs who manage to get log files from clients). Some of the biggest brands, agencies and even some search engines use our software.
There have been so many challenges over the past 10 years, but we’ve remained focused on listening to users, building the product and features they want, and providing the best support available. This has always been natural, as we live the same life as our users (as SEOs ourselves), experiencing the same day-to-day pain points. It’s been an amazing journey and you can read about our story here.
While I am not one for great reflection just yet, as the story continues, I just hope we’ve played a small role in making SEOs’ lives easier and helping the industry do its job a bit better.
Most importantly, we could not have done it without the support of the SEO community. It’s unique. The feature requests, feedback and support we’ve received from all over the world have been incredible, so thank you to everyone who has played a role in the direction, development and continued evolution.
OG Hoodie Giveaway
Rather than ’10 Things I’ve Learnt In 10 Years’ or a coordinated dance routine with the team, what better way to celebrate 10 years than to give away 10-year-old Screaming Frog hoodies?
While these haven’t really been loitering in the swag loft (attic for those outside the UK) for a decade, they are the original design and hoodies we first made and gave out to the team and close friends many years ago. Available only in black.
Here’s some of our team picking them up from SF HQ recently, while practicing social distancing.
We’ll also send you some SF stickers for your laptop.
We ran out of laptops.
How To Win
In total we have 100 OG hoodies to giveaway. All you need to do to have a chance to win is leave a comment below with the feature you’d like to see next.
While unique and interesting features are appreciated, the winners will be selected at random, and we will be in touch regarding sizes and delivery – so do make sure you use your real email address in the comment. You can enter from anywhere in the world that receives international post from the UK. The giveaway is one entry per person.
You don’t need to worry about providing us with your email address – we promise not to use it for anything promotional; we are just giving away super cool hoodies.
The competition will end in 7 days’ time on the 16th of November ’20. Good luck everyone!
The post Screaming Frog 10 Year Anniversary Giveaway appeared first on Screaming Frog.
via Screaming Frog
Lies, damned lies, and statistics. This pithy quote sums up most people’s attitude to data. It’s either untrustworthy, unreliable, or just plain boring.
But in the right hands data can be turned into stories. Stories that captivate. Stories that excite. Stories that get links from top tier publications.
This post will explore how to find great data online, how to use that data to spark an idea, and perhaps most importantly, how not to be wrong when analysing the data.
We’ll also take a brief look at data visualisation techniques, but that really deserves a whole post of its own. Probably from someone with more graphic design experience.
Data Sources
There’s a lot of data online. It’s estimated that Google, Amazon, Microsoft and Facebook store at least 1.2 million terabytes between them. But that’s just the tip of the iceberg. The amount of data on the entire internet is thought to be in the order of zettabytes. That’s a one with 21 zeroes.
To even begin to comprehend how much information this is, imagine streaming a Netflix movie this size. With a broadband speed of 100 Gbps, this would take you 2,535 years. That’s a lot of popcorn.
So, finding the right data is tricky. Thankfully an excellent marketer has put a guide together for you.
For most of us, the first step is likely Google. Many datasets can be found this way, especially those from governments or public bodies. There are a few things to keep in mind when Googling for data:
Use the word “historical” when searching for older data
Use the word “data” in your search
Keep your initial search broad (crime data London) to see the options, then explore the different sources (data.london.gov, Met Police, ONS etc.). I prefer to do this in separate tabs, keeping the main search results page open too
Once you know what you’re looking for, use an advanced Google search for .xls or .xlsx files or government websites
filetype:xls / xlsx / pdf etc.
site:gov.uk
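Putting those together, a single (hypothetical) query might look like this:
crime data london filetype:xlsx site:gov.uk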
Google also has a separate search engine specifically for datasets, which is definitely worth a visit.
Finally, not all datasets are easily accessible this way, especially the more obscure ones. Jeremy Singer-Vine, the data editor at Buzzfeed News, has a weekly newsletter for precisely this reason.
Most weeks, he sends 4 or 5 interesting datasets or data stories. These range from every place name in the United States, to one artist’s 48 first kisses.
The best part is that the archive of all the datasets is available to view at any time, saving you the trouble of hunting through your inbox for the email you know mentioned where walruses like to hang out.
Using Data to Spark an Idea
We’ve written about ideation before. This process, which I like to call data-driven ideation, is slightly different.
Whereas Tom’s method involves coming up with an idea, then looking for sources to support it, I like to turn it on its head. I look for datasets or data-driven articles relevant to my client’s niche, then think of the questions those datasets could answer.
For example, we found a database of all the publicly-owned art in the UK. The questions this could answer are endless, but we focused on one: how many sculptures are made by women?
When thinking of questions to answer, there are two things to keep in mind. One, has the question been answered before? If it hasn’t, great. If it has, would your answer add anything to the conversation?
Data Analysis
Once you have your data source, more often than not you’ll have to do some manipulation to get it how you want it. Governments in particular seem to delight in awkward Excel spreadsheets formatted in ways you wouldn’t expect – like using columns when rows would make more sense…
Averages
As content marketers/data analysts, the humble average is the metric we’re most likely to work with. But did you know there is more than one type of average?
Mean
The mean is what most people mean when they refer to the average. It’s calculated by adding all the numbers up and dividing by the total number of values. Or by using the AVERAGE formula in Sheets or Excel. It doesn’t cope well with outliers, so if your data is skewed, please move along.
Median
The median is the middle value of a dataset and is calculated by ordering the numbers and finding the middle value. You can also use the MEDIAN formula in your spreadsheet tool of choice.
This measure is more suited to skewed datasets. Say I’m looking for a house in London and I want to know the average price of one. All the oligarchs buying penthouses will skew the mean so high it’ll just depress me. But the median takes into account the much larger number of slightly more reasonably priced houses, leading to a hopefully less depressing number.
Mode
Finally, we come to the mode. This is the most common value in a dataset. If your dataset is numerical, then you can use the MODE formula. If it’s the most common text you’re trying to find, then the formula is slightly more involved, but still accessible.
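If you’d rather sanity-check your spreadsheet’s output, here’s a minimal sketch of all three averages using Python’s built-in statistics module – the house prices are entirely made up for illustration:

import statistics

# Hypothetical house prices in £, with one oligarch-sized outlier at the end
prices = [250_000, 300_000, 320_000, 350_000, 400_000, 15_000_000]

print(statistics.mean(prices))        # 2770000 – dragged up by the outlier
print(statistics.median(prices))      # 335000.0 – a far less depressing figure
print(statistics.mode([1, 2, 2, 3]))  # 2 – the most common value in a numerical dataset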
Outliers
Outliers are results that are either much larger or much smaller than you would expect based on the rest of the data. They can really mess up your analysis if you’re not careful.
If outliers are:
A measurement error or data entry error, you should correct the error if possible. If you can’t fix it, remove that observation because you know it’s incorrect.
Not a part of the population you’re studying (because of unusual properties or conditions), you can remove the outlier.
A natural part of the population you’re studying, you shouldn’t remove it.
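The list above is about what to do with outliers rather than how to spot them, but if you want a quick way of flagging candidates, one common rule of thumb (my addition, not from the post) is anything sitting more than 1.5 × the interquartile range beyond the quartiles. A minimal sketch:

import statistics

def flag_outliers(values):
    # Flag values more than 1.5 * IQR outside the quartiles – a common rule of thumb
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < lower or v > upper]

print(flag_outliers([12, 14, 15, 15, 16, 18, 95]))  # [95]

Anything it flags still needs the judgement call above: error, different population, or a genuine part of the data.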
Accounting for Population Size
A lot of data-driven content involves ranking areas by one or more metrics, like pint prices or spider sightings. It’s an excellent way to get coverage in a lot of different local publications.
But there are pitfalls when ranking areas by metrics that could be affected by population size, as this oft-referenced XKCD comic highlights.
To remove this issue, we use per capita measurements. This is essentially a fancy Latin way of saying divide your metric by the population of the area it refers to, giving the metric per person.
In most cases, unless you’re dealing with silly numbers like GDP or national debt, this will give you a tiny number. So, to make it more manageable, the convention is to multiply by 100,000. This gives you the metric per 100,000 people in that area. And voila, the largest place no longer wins every time.
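As a quick sketch of the sums, with made-up figures:

# Hypothetical figures for illustration only
spider_sightings = 420
population = 1_200_000

per_capita = spider_sightings / population  # roughly 0.00035 – not a friendly number
per_100k = per_capita * 100_000             # roughly 35 sightings per 100,000 people
print(round(per_100k, 1))                   # 35.0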
How Not to be Wrong
There are many ways of being wrong, but only one way of being right. Here we’ll look at some of the most common pitfalls in data analysis and how you can avoid them.
Picking out a data range that supports a point of view, while ignoring the larger trend
Saying something about a larger group based on a non-representative sample
Using percentage change for small numbers – this is misleading
Correlation doesn’t equal causation. Even if we don’t say something is causing something else, putting two trends next to each other encourages readers to draw that conclusion
Avoid unnecessary accuracy: taking numbers past the decimal point can be deceptive if one number in the calculation is an estimate
Don’t confuse the percentage point difference (40% – 30% = 10 percentage points) with percentage change (40% to 30% is a 25% decrease) – there’s a short worked example after this list
In general, take care when using percentage change with values that are already percentages. This can introduce other errors
Record your steps, including clearly where you got the data from. Hidden sections of websites can be difficult to find
Make sure you’re dividing by the right number in percentages or division
Standardise dates, including breaking them up into day/month/year if necessary
Don’t type things and use formulas wherever possible. Entering data by hand introduces mistakes
Spot check your data after doing large changes
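To make the percentage point vs percentage change distinction concrete, here’s the promised worked example:

old_rate, new_rate = 40, 30  # both values are already percentages

point_difference = old_rate - new_rate                    # 10 percentage points
percent_change = (new_rate - old_rate) / old_rate * 100   # -25.0, i.e. a 25% decrease

print(point_difference, percent_change)  # 10 -25.0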
Data-driven content should be rigorous and accurate. Journalists aren’t going to cover, let alone link to, something where the numbers have been fudged. Being as careful as possible is the best way to prevent this.
Data Visualisation
Choosing how to visualise the data you’ve so carefully collected and analysed is one of the most important parts of the process. After all, most people are turned off by spreadsheets. Making your data beautiful is another step towards the coverage you want.
What story you want to tell is the main driver behind what chart you choose. For example, a bar chart is best for displaying the number of items in each category, but a line chart is the choice for showing how the data has changed over time.
There are also other, fancier charts if you want to be bolder. Choropleth charts use colour to visualise values over a geographical area, while Sankey diagrams show the transfer of something (energy, money etc.) from one place to another.
For an excellent guide to different chart types and what they’re used for, visit the Data Visualisation Catalogue.
Conclusion
The guide above will help you turn boring old numbers into an exciting content campaign.
To sum up the process, first, find a great dataset library to go back to again and again. Then, when looking for ideas, find relevant datasets and see if they spark ideas or questions to answer.
Getting the data ready to analyse is probably the most boring, but most important, part of the whole process. Use all the Excel/Sheets hacks you know to make analysis as quick and easy as possible.
Finally, choose a suitable visualisation technique and let the links roll in. Well, you’ll have to outreach it first, but that’s a story for another day.
The post How to Turn Data into a Content Marketing Campaign appeared first on Screaming Frog.
via Screaming Frog
I recently ran into an issue with the WooCommerce connector to Facebook shopping/catalogue feed, where I couldn’t automatically update the feed.
This got me thinking about other ways I can upload a feed to the Facebook catalogue, and what the best way to do that is. The simplest solution appeared to be Google Sheets, which then posed the next question – how do I get all the information needed for a Facebook feed into a spreadsheet without visiting every page of a website and doing it manually? It was at this point I remembered I work for Screaming Frog.
The first thing I needed to do was figure out which fields Facebook requires to get a feed up and running. The required fields are as follows:
id
title
description
availability
condition
price
link
image_link
brand, mpn or gtin (include at least one)
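To make the structure concrete, a single (entirely made-up) row might end up looking something like this once populated:

id: SF-001 | title: Example Hoodie | description: Original black hoodie | availability: in stock | condition: new | price: 29.99 GBP | link: https://example.com/hoodie | image_link: https://example.com/hoodie.jpg | brand: Example Brand

Note the price format – we’ll come back to that when refining the exported data.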
To make things easy for those of you trying to recreate my steps, I have created a Google Sheet that you can make a copy of if you wish here –
https://docs.google.com/spreadsheets/d/1WG3a60oK0RVDZTN40QrlGu6dHIMktVTiabsGLI1NZW0/edit?usp=sharing
For the purposes of this blog, I have focussed on the minimum required fields but there are several other fields that may be relevant to you.  For additional information see the Facebook guide here – https://developers.facebook.com/docs/marketing-api/catalog/reference/#da-commerce
Setting up the crawl
Now we have the template in place, all we need to do is populate it with all the required information. This is where a licensed copy of the SEO Spider comes in – and in particular, one of its most powerful features, custom extraction.
Using this feature, we can take all the necessary data from the site for every product. There is an excellent custom extraction user guide on the Screaming Frog site to help guide you through any issues.
I have run through an example here to show what the outcome may look like, although depending on how a site is set up there are likely to be different ways of extracting the data you need. There are also examples of how to extract data using both regex and XPath.
Once you have opened the spider, set up custom extraction by clicking ‘Config > Custom > Extraction’.
From there you need to pull all the elements that won’t be collected by default in a crawl, such as price, availability, image and brand/mpn/gtin. In this case, I have used regex to extract the data I wanted.
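What the expressions look like depends entirely on how your site’s HTML is put together, but as a purely hypothetical example, if your product data sits in a JSON-LD block, extractions along these lines might pull out the price and availability (the regex extraction works with a capture group, returning whatever sits inside the brackets):

"price":\s*"(.*?)"
"availability":\s*"https?://schema.org/(.*?)"

Treat these as a starting point only – view your page source and test your patterns against it first.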
The end result should be a crawl which has all the data from the various attributes you have extracted from the site for each of the products.
Now we need to export the crawl to an Excel file and start refining the data to suit our needs. Some columns can be auto-filled – condition, for example, is likely to be ‘new’ for everything. Then it is just a case of copying and pasting the various columns to match the Google spreadsheet headers. A couple of columns are also likely to need some formatting – in particular the price, as this needs to include a three-letter ISO currency code after the value, so a simple find and replace to remove any currency symbols, followed by a concatenation formula, should get all your prices into the correct format.
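For instance, assuming your cleaned-up prices sit in column F of the sheet (adjust the column and currency to suit), something along the lines of =F2&" GBP" in a spare column – then pasted back in as values – appends the currency code for you.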
You can then do a check to ensure all the products that you want included are in the list. It might be that you don’t need or want to promote products under £5, for example, so you can filter those out.
Once you have been able to do that you can copy and paste that data straight into the Google Spreadsheet that you created earlier.
Uploading the feed to Facebook
All that is left to do is upload the feed into your Catalogue Manager (or Catalog Manager, depending on which side of the Pond you are on). From Catalogue Manager, go to Catalogue > Data Sources > Add Products.
Now select ‘Use Bulk Upload’ then select ‘Google Sheets’.
There is a step that you can skip as we have (hopefully) already created the Google sheet, ready to drop in the ‘Enter URL’ box.
You can now select how regularly you want the sheet to be imported. This will likely depend on how regularly your inventory changes over time.
From there you should have the ability to use the products as you see fit, either through the shop page or tagging your posts with your products.
Hopefully this guide provides everything you need to be able to create a simple Facebook catalogue feed in a short space of time.
Happy cataloguing
The post How to create a Facebook catalogue feed using the SEO Spider in [time it takes to crawl your site] appeared first on Screaming Frog.
via Screaming Frog
As content marketers, our goal is to create engaging content to reach relevant audiences.
This relies heavily on getting content covered in the media and securing valuable backlinks to a client’s website.
But getting coverage on high DA news sites isn’t easy. If it was, content marketing agencies wouldn’t exist.
To be successful, content marketers must know what they’re up against. It’s as simple as this: to make the news, you have to read the news.
Here are some top tips for keeping up with the daily news flow and how your campaigns will benefit…
Create a ‘black book’ of news sites
Every client will have a list of ideal news publishers they want to be on and so should you.
Set aside time each morning to read the trending stories across your clients’ industries (whether it’s technology, health, politics, sport, travel etc.).
Absorb the topics they regularly cover, check who’s writing stories, their key angles and the types of spokespeople they quote.
Following these sites’ breaking news accounts (and their top journalists) on Twitter can also speed things up, giving you a snapshot of stories as they break.
Some news outlets (usually B2B titles) will also have a weekly email newsletter you can sign up to, with all their big stories from the week gone by.
Let tools do the hard work
If you’re overwhelmed by the constant drum of the news and have mounting client work to be getting on with, here’s a round-up of the best tools for keeping your finger on the global zeitgeist…
Google Alerts is a basic option, but it’s free and easy to set up. It lets you monitor keywords related to your client and their industry and get all the big news stories straight to your inbox, without lifting a finger. Top tip: select “as it happens” when setting up an alert for instant notifications.
If you want to avoid flooding your inbox with generic search terms, use speech marks to find specific phrases or a plus symbol to look for multiple keywords in the same search. This refines your results to more specific stories. You can also keep up with what your clients’ competitors (and their agencies) are up to by monitoring brand names.
Another easy and often overlooked news aggregator is Google Trending Searches. Choose your country from the drop-down menu and it’ll rank its top trending topics each day based on search volume.
If you have a paid subscription to Buzzsumo, you can use its media monitoring resource – it’s often faster and more reliable than Google Alerts. Go to the “monitoring” section and add your client’s (or their competitor’s) brand name or choose a topic to see all the recent results. You can even ask for real-time updates via your Slack channel.
Buzzsumo also has a useful trending section where you can filter the news by topic like tech, health, sport, politics, education etc.
I’d also recommend content curation tools, like Pocket, which let you save all the stories you’ve been meaning to read but haven’t found time for. With offline functionality, you can catch up with the latest content anytime, anywhere…like in the bath with a glass of vino. Don’t judge me.
Monitor media requests
Media requests are the one time a journalist actually wants to hear from you. They’ll send out a call for contributions among the PR community for a story they’re working on, with the hope your company or one of its experts has what they need. This is usually a product, statistic or top tip.
Monitoring these incoming requests is an important part of securing ‘quick wins’ for content you already have banked, as well as finding out their topic du jour.
You’ll often notice a pattern of journalists asking for similar content. That’s when you know a story or topic has viral potential.
Follow the media requests feed on Gorkana’s homepage, #journorequest on Twitter and sign up to Response Source if you can – a subscription media request service journalists use to reach out to the PR and content marketing community.
Follow the influential
Celebrities and influencers are a major source of news. Journalists often check their social feeds and write up stories based on what they find, so why shouldn’t you?
Stay a step ahead and follow influential people in your client’s field – be it beauty, business or sport. If you can link a piece of content to a developing celebrity-based story, you instantly increase its ‘newsworthiness’.
Learn from ‘newsroom SEO’
In the age of social media and Google domination, it isn’t just clients benefitting from search engine optimisation best practice.
SEO has become a vital weapon for news publishers, helping them grow their share of voice in an overcrowded market, attract valuable traffic flow, and generate precious advertising revenues.
News publishers are now working hard to make their sites and content valuable in the eyes of Google. Follow journalists with these slightly odd titles – “SEO Writer”, “Audience Writer” and “Trends Writer”. They’ll be keeping an eye out for the next big thing, just like you.
The post To Make The News, Read The News: Why Content Marketers Should Keep Up With The Daily News Flow appeared first on Screaming Frog.
via Screaming Frog
Speed has long been an official ranking factor, but with the introduction of the Core Web Vitals (CWV), many an SEO might have noticed the ominous Pass/Fail assessment within PageSpeed Insights.
While these metrics aren’t yet used in Google’s algorithm, I saw so many URLs failing that it got me wondering: how many well-ranking URLs end up passing the assessment?
2,500 keywords, 20,000 URLs, and just as many graphs later, I may have found the answer.
TL;DR – Across 20,000 URLs:
First Input Delay (FID) on Desktop is negligible with 99% of URLs considered good. And 89% for Mobile.
43% of Mobile and 44% of Desktop URLs had a good Largest Contentful Paint (LCP).
46% of Mobile and 47% of Desktop URLs had a good Cumulative Layout Shift (CLS).
Only 12% of Mobile and 13% of Desktop results passed the CWV assessment (i.e. considered good in all three metrics).
URLs in Position 1 were 10% more likely to pass the CWV assessment than URLs in Position 9.
  Methodology
As Core Web Vitals are evaluated on a per URL basis, I took 2,500 keywords across 100 different topics, scraping all the first-page organic results of each. In total I ended up with about 22,500 URLs. This was duplicated for both mobile and desktop results.
These were then run through the SEO Spider connected to the PageSpeed Insights API, gathering the necessary PSI & CrUX data.
A couple of caveats:
All results were scraped from a search in Berkshire, UK.
No rich result URLs were included.
10th position is excluded as so few SERPs had 10 organic listings, making the sample size considerably lower.
A handful of results had featured snippets. These are classified as position 1 but may not be the ‘true’ 1st position.
Some sites appeared across multiple rankings (e.g. Wikipedia)
Several URLs could not be analysed in PSI for various reasons.
A Bit on Core Web Vitals
For anyone reading who might not be aware of Core Web Vitals – they’re three metrics Google will use to judge page experience, and they will become an official ranking factor some time in 2021.
Why? To help push the web forward, encouraging site owners to provide better experiences for users – Aaaand likely helping Google to render the web a bit quicker and more efficiently at the same time. Win-Win.
They’re recorded using real user metrics (rUM) from the Chrome User Experience Report (CrUX). (Google search may also use lab data where CrUX is not available, but the analysis below focuses on rUM). PageSpeed Insights (PSI) then reports on the 75th percentile of this data (25% slowest loads), and classifies them by the following thresholds:
Largest Contentful Paint (LCP): measures loading performance. To provide good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.
First Input Delay (FID): measures interactivity. To provide good user experience, pages should have an FID of less than 100 milliseconds.
Cumulative Layout Shift (CLS): measures visual stability. To provide good user experience, pages should maintain a CLS of less than 0.1.
To pass the Core Web Vitals assessment, a URL needs to be considered ‘good’ in all three metrics.
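As a minimal sketch of that pass/fail logic (the function and its arguments are just my own illustration, using the thresholds above):

def passes_core_web_vitals(lcp_seconds, fid_ms, cls):
    # 'Good' in all three 75th-percentile metrics is required to pass
    return lcp_seconds <= 2.5 and fid_ms < 100 and cls < 0.1

print(passes_core_web_vitals(lcp_seconds=3.13, fid_ms=56, cls=0.29))  # False – the mobile averages below
print(passes_core_web_vitals(lcp_seconds=1.8, fid_ms=40, cls=0.05))   # True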
What Did the Data Highlight?
As suspected, only a small proportion of sites ended up passing the CWV assessment – shock! From our list of URLs, only 12% of Mobile and 13% of Desktop results passed.
Excluding those without rUM brought this to 23% and 24% respectively.
What’s more interesting is looking at individual pass rates for each ranking position:
URLs in Position 1 had a pass rate of 19% on Mobile and 20% on Desktop. Moving from 1st to 5th saw a 2% decrease per position. The remaining results from 5-9 flattened out to a pass rate of around 10% on Mobile and 11% on Desktop.
So what’s going on here? Have CWVs been a top-secret ranking factor all along?
Very unlikely, but perhaps not far from the truth. From what I’ve noticed, it tends to boil down to two aspects:
A major part of the CWV assessment focuses on load speed, which we know is already a ranking factor. Therefore, logic would suggest that quicker sites may rank slightly higher and end up passing the assessment in turn.
However, Google continually comments that speed is a minor factor. Instead, I suspect sites ranking in the first 1-4 positions tend to be better optimised overall. With targeted, rich, and user-friendly content. All while loading this information more efficiently.
Breaking Down the Vitals
We can also view the individual metrics on a more granular level. The following table shows classification across the whole URL sample:
First Input Delay
The FID is negligible, with 89% of Mobile and 99% of Desktop URLs within the good threshold. Averaging at around 56ms on Mobile and 13ms on Desktop.
When comparing against position we get much less of a correlation:
Largest Contentful Paint
LCP saw 43% of Mobile and 44% of Desktop URLs considered good. This averaged out at 3.13s for Mobile and 3.04s for Desktop.
When compared against position, we can see a slight trend. But only 0.14s difference between 1st and 9th:
We can also see this reflected in the pass rates (considered good) for each position:
Cumulative Layout Shift
The CLS pass rates were much higher than I anticipated, as this is usually where we see most sites fail. CLS had 46% of Mobile and 47% of Desktop URLs considered good, averaging a CLS of 0.29 on Mobile and 0.25 on Desktop.
This also saw less of a correlation against position, though 1st and 2nd tended to be slightly lower:
When looking at individual pass rates by ranking, we can see a decline in the percentage of ‘good’ URLs as the position moves down the SERP.
First Contentful Paint
Lastly, while it’s not a CWV I also extracted the FCP from CrUX data as another measure of speed. This saw an average of 2.19s on Mobile and 1.99s on Desktop.
While relatively unchanged on desktop, mobile saw a slight increase in load times by position. But only 0.10s between 1st & 9th:
What Can You Take Away from This?
Well, not a whole lot… (sorry). This is still a fairly small sample, and Core Web Vitals are not an official ranking factor just yet and won’t be until 2021, meaning their true impact is yet to be seen.
But if you do happen to load up PageSpeed Insights and see the disheartening ‘fail’ message, fear not – you’re in good company with most other sites.
Will passing the assessment immediately boost your rankings? Probably not. Should you strive to pass it anyway? Definitely.
Regardless of how much ranking benefit speed and CWV’s provide, having your pages as quick, responsive, and stable as possible is great for users and search engines alike.
If you’re looking to make improvements, PageSpeed Insights is a great place to start. And you can easily grab data across your entire site with the SEO Spider and steps here.
The post How Many Sites Pass the Core Web Vitals Assessment? appeared first on Screaming Frog.
via Screaming Frog
I got a job offer to be an SEO Consultant for Screaming Frog while I was sat in a rather goofy graduation hat and gown.
My parents then humiliated me by running around telling all the other parents of my uni pals (who we’d literally just met for the first time) about how I now had some fancy job at Google.
Me coping with the realisation that I was going to be an SEO
That was five years ago this month, so now is an excellent time to put that history degree to good use (It cost me enough!) with some musings on how the SEO industry has changed in the last half-decade.
1. Becoming Lord of the SERPs Is Harder Than Ever
Google’s been following their own advice.
The search engine has been gradually expanding the content of results pages with more and more useful features. Their bounce rates must be at an all time low, with users increasingly finding everything they need within the SERP itself.
The age of ’10 blue links’ is long over.
And it’s not just the distracting new bits and pieces they keep adding, such as user-uploaded video-answers to questions about ginger:
Everything you ever wanted to know about ginger. RIP Ginger-Facts.com.
FAQ Mark-up below existing organic listings, as one example, can also be a killer for sites just off the first page:
Yuck.
Roast Agency do a really good list of all the possible SERP features, but honestly my finger aches trying to scroll down all of it, there’s just so many.
Paid placements are also increasingly chipping away at organic real estate – especially on mobile.
The shift in gear from ‘Sponsored Links’ in a big blue box, to the much more concise ‘Ad’ in a yellow box, to ‘Ad’ in a green box that is conveniently the same green as the URL in the snippet, has undoubtedly reduced our organic CTRs.
Does this mean SEO is dead? No.
It just means smart SEO is more necessary than ever – the huge opportunities for traffic and sales are still out there; they just require more brains and budget to access.
2. Link Building Got Wayyyyy Tougher
I have a confession to make.
Not all the content I made for links in 2015 was ground-breaking, newsworthy, data driven #content that shook up the media landscape, went viral, and built 2,000 DA 99 followed links.
Much of it was infographics, which are now synonymous with low quality ‘link bait’ content.
But that was okay, content back then didn’t have to set the world on fire.
Times have changed, and the big dogs are investing more of their Xmas TV ad budget into this whole ‘digital’ thing. This is starting to trickle down into SEO and link building- which is pushing standards up.
When a journalist has 400 press requests in their inbox, many of them well-researched, innovative interactive assets, they are sadly unlikely to go for your infographic on the top 10 fictional books that appear in fiction.
Formats come and go, so innovation is key if you want to stay ahead.
Really scientific slide I presented to BrightonSEO about how link building tactics are getting more complex. You had to be there.
But that’s not to say you always need a huge interactive map to win big links. The story is what journalists are interested in. Even small brands with small budgets can be nimble, and outflank those glitzy campaigns you see from the big retail giants.
Outreach has changed a lot too.
You can’t just blast the same email to 200 journos, sit back, and watch the links flow in.
It’s a game of cat and mouse, and (sometimes frustratingly) we have to do the hard work for ’em by tweaking our content so that it’s bespoke and relevant for each publication.
The alliance between PRs and SEOs, especially at agencies, is stronger than ever. Agencies need personable people who can establish long-term relationships with the key influencers in the right niches, as much as they need competent SEOs who can establish the right link building strategy for the client.
SF PR Manager Amy carries me every day.
Link building in 2020 is no easy feat, but it’s definitely still undervalued by a lot of brands. Work with the people who value it and you will still succeed.
3. Not Everything Has Changed
Despite the doom and gloom of those timely and well targeted yearly SEO prediction lists:
Google is still the dominant search engine.
Voice search still hasn’t transformed the landscape.
The Yahoo! Toolbar still exists.
Most content still isn’t video.
Users still trust organic results more than ads.
People still use ‘content is king’ in blog posts.
The Apple Watch didn’t transform local SEO… why would it have..?
The web isn’t 100% AMP and all held ransom in some dark Google-owned server.
There will come a time when SEOs no longer post memes on Twitter, but it is not this day.
You can’t (yet) automate good SEO.
I still have much to learn.
Hindsight still makes writing history easy.
Even in a global pandemic we keep getting results, attending virtual SEO conferences, and Friday beer o’clock is stronger than ever.
The Screaming Frog ‘Guess That WFH Desk!’ Quiz. Hours of fun.
4. Tech SEOs Have Had to Understand People Too
As with the macarena at the year 6 disco, algorithms are getting more complex, and so must our work to keep up with the crowd.
After real-time Penguin 4.0 launched in October 2016, Google’s fight against spam was largely over. They then turned their attention to improving how they understand website quality.
The speed and mobile friendliness revolutions have come and gone. Moves toward ‘page experience’ will be the next battleground, and this time they’ve been kind enough to give us a heads-up.
SEOs need to go beyond reviewing copy and 301 redirects if they want to stay ahead of the game.
We now need to be working with in-house marketing teams, designers, and developers to ensure that site design moving forward not only satisfies technical best practices, but also improves on what’s offered by competing sites to deliver a superior experience for users.
Searcher intent too, especially post-Medic Update, is an area that Google’s gotten much better at.
Through properly researching what sort of content is ranking for generic keywords that were previously exclusively ‘commercial’ or ‘informational’, we can do a better job of serving users exactly what they’re after.
Google often discuss the importance of ‘Micro-Moments’, which are the most crucial ways users interact with search before they take an action which might be beneficial to your business. There are four main micro-moments that Google highlights:
I want to know – about ways I can find love
I want to do – a course on writing persuasive text for billboards
I want to buy – a billboard to advertise that I’m single
I want to go – to the best bar in town with my date
If you’re only going for landing pages around ‘I want to buy’ then you’re missing out on a lot of your potential customer’s time and attention.
Google have always pushed that we should build sites for users. Previously you could skirt around this with black/grey hat techniques – but in 2020 it’s next to impossible to see long-term success without just focusing on what will make users happy.
5. You’re (Still) All A Bunch of Legends
This list has been a little depressing.
But the only reason SEO is becoming more challenging is because everyone in the industry is maturing and getting smarter.
And that’s largely due to the crazy amount of collaboration and support given out, even amongst rival agencies and freelancers.
Whether it’s detailed analysis, actionable advice, creative inspiration, inclusive career support, or ideas for blog posts to rip off, SEOs have always gone above and beyond to support each other. And a lot of it is free.
(But no, I will not update this post to give you a link. That’s where I draw the line.)
Frogs after a long day of drinking up knowledge at our fave conference
My IT teacher told 15-year-old me: “Most of you lot will have jobs that haven’t been invented yet.”
At the time I thought that was nonsense, but now I am a (Senior) Search Engine Optimisation Manager. That was never a thing before!!
For a young industry with not much in terms of ‘official’ recognition, it’s humbling to see the staggering amounts of resources, talks, and blogs available to help newcomers learn and improve on what has been written already. I can only imagine where we’ll be in another five years time. Especially on the meme front.
My dad’s an accountant. Can you imagine a bunch of accountants on Brighton pier sharing ideas over a beer?
Neither can I.
The post Five Ways SEO Has Changed In the Last Five Years appeared first on Screaming Frog.
via Screaming Frog
Everyone thinks they can write.
And it’s true – everyone can write. But not everyone writes well.
The potential of online copy shouldn’t be overlooked. It engages readers, helps them move around your site and convert.
So, how do you make your writing better? Slip in rhetorical questions to set you up to introduce your topic? Sometimes.
These are my tips for effective online copywriting.
A bit about online readers
Before you write anything, you need to know who you’re writing for.
Google Analytics tells you about your audience’s age, gender, interests and the device they’re using – but there’s something else you need to know about the online reader.
The human attention span is now just 8 seconds.
People simply can’t be bothered with things that don’t hold their attention. They know they can find something more engaging or insightful elsewhere, at the touch of a button.
Avoid long sentences, chunky paragraphs and complex language. Online readers want bite-sized tips and instructions. They want to find information easily, whether it’s product spec, breaking news or instructions.
Thick paragraphs are daunting to those who only searched ‘how to cook steak’. They don’t need to know you discovered your favourite spice on a trip to Sri Lanka in the summer of 1999.
Use headings to break up information and give each tip or theme its own paragraph. This makes it simple for users to navigate.
Q&As and bullet points are ideal for highlighting key product features and explaining your brand or service.
Place key information at the top of the page so users know what to expect. Then, give them the information they need in as few words as possible.
Your bounce rates will thank you.
Keep it chatty
Have you ever noticed your mouth moves when you read? Your lips and the internal muscles of your tongue and larynx twitch as you scan text.
This tells us our writing needs to replicate speech. Readers are ‘talking through’ our writing, so it needs to sound natural. They don’t want to trip over clunky sentences and jargon.
Writing needs to have rhythm and sound chatty. Vary sentence length to replicate how you speak. And be flexible with what you learned at school. If starting a sentence with a conjunction like ‘and’ or ‘but’ sounds natural, go with it.
Don’t use words you wouldn’t say when speaking to someone. With online writing, you’re not trying to impress a teacher. You wouldn’t say ‘in addition’ or ‘moreover’ out loud, so don’t use them in writing.
Contractions (‘can’t’, ‘won’t’, ‘doesn’t’ etc) help build the rhythm of natural speech and sound friendly and conversational compared with their longer forms.
The power of words
A respected copywriter said, “writing for the sake of words is copywriting death” (Jack Prouse, 2020).
We know readers have short attention spans, so why waste our brief chance to engage them on words that add no value?
The biggest offender is adverbs. Words like ‘very’ and ‘fairly’ offer little extra meaning.
“This is a very important message”. Well, it’s not going to be slightly important, is it?
Adjectives have their place – especially in product descriptions and fiction writing – but ask yourself, are they giving the reader something they don’t already know?
I once read the following:
“London has seen a major explosion in the number of vegan restaurants this past year.”
A major explosion? What explosion isn’t major? The word ‘explosion’ speaks for itself.
Leave unnecessary adjectives out of your writing.
Lookin’ good
No one’s scoring the writing on your website. You don’t get marks for punctuation.
This doesn’t mean you should be intentionally breaking grammar rules but having freedom with punctuation can break up text and create a conversational tone.
For example, readers may be intimidated by semi-colons – I know it sounds silly but they’re formal and can interrupt the flow of the reader’s natural rhythm.
Swapping semi-colons for hyphens creates important white space on the page and allows for natural pauses in speech to emphasise each part of the sentence.
Break text up into short 1-2 sentence blocks. This is appealing to online readers as it’s digestible and helps them quickly locate the information they’re after.
Big it up
If you’re showing off your brand or a client, big them up. Don’t pull punches. Sell them.
The future tense construction – ‘will + verb’ – dilutes the power of your statement.
Take a look at the following sentences:
“The Screaming Frog SEO Spider website crawler will help you improve onsite SEO by…”
“The Screaming Frog SEO Spider website crawler helps you improve onsite SEO by…”
It’s only a small change but it makes the sentence bold and authoritative. There’s no possibility about your product or service, it just works.
Avoid modal verbs, too. Words like ‘can’, ‘should’ and ‘may’ dampen the impact of statements.
A key consideration
For your writing to be seen, it has to rank on search engines. So, it needs to answer common user questions and target popular queries around your topic.
Tools like Google Ads Keyword Planner provide topical search terms and their volume. But remember, this research needs to be done before writing. Keywords must inform copy, not get stuffed into existing pages as an afterthought.
Cover key search terms naturally by using them as a guide to structuring pages. If these are popular search terms, what does it suggest the user is looking for?
If they’re asking a question, answer it directly and concisely. If they’re looking to buy something, break down key product information in bullet points. If they’re looking for advice, provide actionable and chronological instructions.
Only work keywords into your writing where they read naturally. ‘Keyword stuffing’ (deliberately placing keywords into pages in the hope of ranking) is unhelpful for users and encourages them to bounce from your site. At worst, it can even be penalised by search engines.
Naturally fitting keywords into subheadings using a Q&A structure, for example, is a natural and helpful way to target terms – helping users quickly find the information they’re after.
Tools
We all need a little help sometimes. These free tools keep your writing at its best:
Grammarly – Grammarly is your personal proof-reader. The tool spots mistakes you missed in proofing, like absent words or misspellings. It also offers tips to make your writing more concise.
Hemingway Editor – Hemingway Editor scores your writing on its readability and offers suggestions to make it clearer and more engaging.
SMOG – The SMOG calculator gives you an idea of the reading ability a user needs to understand your writing. It’s ideal for those writing for different publications and audiences, to check their tone is suitable.
As with all writing, the most important things to consider are where it’s published and who’s reading it. Some of these tips may not be relevant when writing a whitepaper for business leaders, for example.
But the most effective online copywriting is simple, natural and recognisable. So, put these tips to use next time you’re crafting copy and let us know your go-to copywriting tips in the comments below.
The post Write Better: Tips For Effective Online Copywriting appeared first on Screaming Frog.
via Screaming Frog
We are excited to announce the release of Screaming Frog SEO Spider version 13.0, codenamed internally as ‘Lockdown’.
We’ve been busy developing exciting new features, and despite the obvious change in priorities for everyone right now, we want to continue to release updates as normal that help users in the work they do.
Let’s take a look at what’s new.
1) Near Duplicate Content
You can now discover near-duplicate pages, not just exact duplicates. We’ve introduced a new ‘Content‘ tab, which includes filters for both ‘Near Duplicates’ and ‘Exact Duplicates’.
While there isn’t a duplicate content penalty, having similar pages can cause cannibalisation issues and crawling and indexing inefficiencies. Very similar pages should be minimised and high similarity could be a sign of low-quality pages, which haven’t received much love – or just shouldn’t be separate pages in the first place.
For ‘Near Duplicates’, the SEO Spider will show you the closest similarity match %, as well as the number of near-duplicates for each URL. The ‘Exact Duplicates’ filter uses the same algorithmic check for identifying identical pages that was previously named ‘Duplicate’ under the ‘URL’ tab.
The new ‘Near Duplicates’ detection uses a minhash algorithm, which allows you to configure a near-duplicate similarity threshold – set at 90% by default. This can be configured via ‘Config > Content > Duplicates’.
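To give a feel for how MinHash works (a toy illustration only – not the SEO Spider’s actual implementation), the idea is to reduce each page’s content to a set of word shingles, then compare the minimum hash values of those shingles across many hash functions; the proportion of agreeing slots approximates the Jaccard similarity of the two pages:

import random

def minhash_signature(text, num_hashes=128, shingle_size=5, seed=42):
    # Build a signature from overlapping word shingles of the text
    words = text.lower().split()
    shingles = {" ".join(words[i:i + shingle_size])
                for i in range(max(1, len(words) - shingle_size + 1))}
    rng = random.Random(seed)
    salts = [rng.getrandbits(64) for _ in range(num_hashes)]
    return [min(hash(s) ^ salt for s in shingles) for salt in salts]

def similarity(sig_a, sig_b):
    # Fraction of matching slots approximates the shingle sets' Jaccard similarity
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

sig_a = minhash_signature("near duplicate detection helps find pages that share most of their main content")
sig_b = minhash_signature("near duplicate detection helps find pages that share most of their body content")
print(similarity(sig_a, sig_b))  # approximates the shingle overlap of two near-identical sentences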
Semantic elements such as the nav and footer are automatically excluded from the content analysis, but you can refine it further by excluding or including HTML elements, classes and IDs. This can help focus the analysis on the main content area, avoiding known boilerplate text. It can also be used to provide a more accurate word count.
Near duplicates requires post crawl analysis to be populated, and more detail on the duplicates can be seen in the new ‘Duplicate Details’ lower tab. This displays every near-duplicate URL identified, and their similarity match.
Clicking on a ‘Near Duplicate Address’ in the ‘Duplicate Details’ tab will display the near duplicate content discovered between the pages, and perform a diff to highlight the differences.
The near-duplicate content threshold and content area used in the analysis can both be updated post-crawl, and crawl analysis can be re-run to refine the results, without the need for re-crawling.
The ‘Content’ tab also includes a ‘Low Content Pages’ filter, which identifies pages with less than 200 words using the improved word count. This can be adjusted to your preferences under ‘Config > Spider > Preferences’ as there obviously isn’t a one-size-fits-all measure for minimum word count in SEO.
2) Spelling & Grammar
If you’ve found yourself with extra time under lockdown, then we know just the way you can spend it (sorry).
You’re now also able to perform a spelling and grammar check during a crawl. The new ‘Content’ tab has filters for ‘Spelling Errors’ and ‘Grammar Errors’ and displays counts for each page crawled.
You can enable spelling and grammar checks via ‘Config > Content > Spelling & Grammar’.
While this is a little different from our usual very ‘SEO-focused’ features, a large part of our roles are about improving websites for users. Google’s own search quality evaluator guidelines outline spelling and grammar errors numerous times as one of the characteristics of low-quality pages (if you need convincing!).
The lower window ‘Spelling & Grammar Details’ tab shows you the error, type (spelling or grammar), detail, and provides a suggestion to correct the issue.
The right-hand-side of the details tab also shows you a visual of the text from the page and errors identified.
The right-hand pane ‘Spelling & Grammar’ tab displays the top 100 unique errors discovered and the number of URLs it affects. This can be helpful for finding errors across templates, and for building your dictionary or ignore list.
The new spelling and grammar feature will auto-identify the language used on a page (via the HTML language attribute), but also allow you to manually select language where required. It supports 39 languages, including English (UK, USA, Aus etc), German, French, Dutch, Spanish, Italian, Danish, Swedish, Japanese, Russian, Arabic and more.
You’re able to ignore words for a crawl, add to a dictionary (which is remembered across crawls), disable grammar rules and exclude or include content in specific HTML elements, classes or IDs for spelling and grammar checks.
You’re also able to ‘update’ the spelling and grammar check to reflect changes to your dictionary, ignore list or grammar rules without re-crawling the URLs.
As you would expect, you can export all the data via the ‘Bulk Export > Content’ menu.
Please don’t send us any ‘broken spelling/grammar’ link building emails.
3) Improved Link Data – Link Position, Path Type & Target
Some of our most requested features have been around link data. You want more, to be able to make better decisions. We’ve listened, and the SEO Spider now records some new attributes for every link.
Link Position
You’re now able to see the ‘link position’ of every link in a crawl – such as whether it’s in the navigation, the content of the page, the sidebar or the footer. The classification is performed using each link’s ‘link path’ (as an XPath) and known semantic substrings, which can be seen in the ‘inlinks’ and ‘outlinks’ tabs.
If your website uses semantic HTML5 elements (or well-named non-semantic elements, such as div id=”nav”), the SEO Spider will be able to automatically determine different parts of a web page and the links within them.
But not every website is built in this way, so you’re able to configure the link position classification under ‘Config > Custom > Link Positions’. This allows you to use a substring of the link path, to classify it as you wish.
For example, we have mobile menu links outside the nav element that are determined to be in ‘content’ links. This is incorrect, as they are just an additional sitewide navigation on mobile.
The ‘mobile-menu__dropdown’ class name (which is in the link path as shown above) can be used to define its correct link position using the Link Positions feature.
These links will then be correctly attributed as a sitewide navigation link.
This can help identify ‘inlinks’ to a page that are only from in-body content, for example, ignoring any links in the main navigation, or footer for better internal link analysis.
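As a rough illustration of the general idea (not the SEO Spider’s actual code), classifying a link by substrings of its link path might look something like this, with user-defined rules checked first:

def classify_link_position(link_path, custom_rules=None):
    # Return a position label based on substrings found in the link's XPath-style path
    rules = (custom_rules or []) + [
        ("/nav", "Navigation"),
        ("/header", "Header"),
        ("/footer", "Footer"),
        ("/aside", "Sidebar"),
    ]
    for substring, label in rules:
        if substring in link_path:
            return label
    return "Content"

# The mobile menu example above: a class substring maps those links back to navigation
print(classify_link_position(
    "/html/body/div[@class='mobile-menu__dropdown']/ul/li[2]/a",
    custom_rules=[("mobile-menu__dropdown", "Navigation")],
))  # Navigation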
Path Type
The ‘path type’ of a link is also recorded (absolute, path-relative, protocol-relative or root-relative), which can be seen in inlinks, outlinks and all bulk exports.
This can help identify links which should be absolute, as there are some integrity, security and performance issues with relative linking under some circumstances.
Target Attribute
Additionally, we now show the ‘target’ attribute for every link, to help identify links which use ‘_blank’ to open in a new tab.
This is helpful when analysing usability, but also performance and security – which brings us onto the next feature.
4) Security Checks
The ‘Protocol’ tab has been renamed to ‘Security’, and more up-to-date security-related checks and filters have been introduced.
While the SEO Spider was already able to identify HTTP URLs, mixed content and other insecure elements, exposing them within filters helps you spot them more easily.
You’re able to quickly find mixed content, issues with insecure forms, unsafe cross-origin links, protocol-relative resource links, missing security headers and more.
The old insecure content report remains as well, as this checks all elements (canonicals, hreflang etc) for insecure elements and is helpful for HTTPS migrations.
The new security checks introduced are focused on the most common issues related to SEO, web performance and security, but this functionality might be extended to cover additional security checks based upon user feedback.
5) Improved UX Bits
We’ve found some new users could get confused between the ‘Enter URL to spider’ bar at the top, and the ‘search’ bar on the side. The size of the ‘search’ bar had grown, and the main URL bar was possibly a little too subtle.
So we have adjusted sizing, colour, text and included an icon to make it clearer where to put your URL.
If that doesn’t work, then we’ve got another concept ready and waiting for trial.
The ‘Image Details’ tab now displays a preview of the image, alongside its associated alt text. This makes image auditing much easier!
You can highlight cells in the higher and lower windows, and the SEO Spider will display a ‘Selected Cells’ count.
The lower windows now have filters and a search, to help find URLs and data more efficiently.
Site visualisations now have an improved zoom, and the tree graph nodes spacing can be much closer together to view a site in its entirety. So pretty.
Oh, and in the ‘View Source’ tab, you can now click ‘Show Differences’ and it will perform a diff between the raw and rendered HTML.
Other Updates
Version 13.0 also includes a number of smaller updates and bug fixes, outlined below.
The PageSpeed Insights API integration has been updated with the new Core Web Vitals metrics (Largest Contentful Paint, First Input Delay and Cumulative Layout Shift). ‘Total Blocking Time’ Lighthouse metric and ‘Remove Unused JavaScript’ opportunity are also now available. Additionally, we’ve introduced a new ‘JavaScript Coverage Summary’ report under ‘Reports > PageSpeed’, which highlights how much of each JavaScript file is unused across a crawl and the potential savings.
Following the Log File Analyser version 4.0, the SEO Spider has been updated to Java 11.
iFrames can now be stored and crawled (under ‘Config > Spider > Crawl’).
Fragments are no longer crawled by default in JavaScript rendering mode. There’s a new ‘Crawl Fragment Identifiers’ configuration under ‘Config > Spider > Advanced’ that allows you to crawl URLs with fragments in any rendering mode.
A tonne of Google features for structured data validation have been updated. We’ve added support for COVID-19 Announcements and Image Licence features. Occupation has been renamed to Estimated Salary and two deprecated features, Place Action and Social Profile, have been removed.
All Hreflang ‘confirmation links’ named filters have been updated to ‘return links’, as this seems to be the common naming used by Google (and who are we to argue?). Check out our How To Audit Hreflang guide for more detail.
Two ‘AMP’ filters have been updated, ‘Non-Confirming Canonical’ has been renamed to ‘Missing Non-AMP Return Link’, and ‘Missing Non-AMP Canonical’ has been renamed to ‘Missing Canonical to Non-AMP’ to make them as clear as possible. Check out our How To Audit & validate AMP guide for more detail.
The ‘Memory’ configuration has been renamed to ‘Memory Allocation’, while ‘Storage’ has been renamed to ‘Storage Mode’ to avoid them getting mixed up. These are both available under ‘Config > System’.
Custom Search results now get appended to the Internal tab when used.
The Forms Based Authentication browser now shows you the URL you’re viewing to make it easier to spot sneaky redirects.
Deprecated APIs have been removed for the Ahrefs integration.
That’s everything. If you experience any problems, then please do just let us know via our support and we’ll help as quickly as possible.
Thank you to everyone for all their feature requests, feedback, and bug reports. Apologies for anyone disappointed we didn’t get to the feature they wanted this time. We prioritise based upon user feedback (and a little internal steer) and we hope to get to them all eventually.
Now, go and download version 13.0 of the Screaming Frog SEO Spider and let us know what you think!
Download Now
The post Screaming Frog SEO Spider Update – Version 13.0 appeared first on Screaming Frog.
0 notes
geeksperhour · 4 years
Link
via Screaming Frog
We’ve all been there: stuck staring at a computer before a content ideation session, wondering how to come up with an idea that will attract big media placements and encourage long-term SEO value. It is certainly not an easy task. With journalists becoming savvier about blatant linkbait tactics, and bloggers trying to capitalise on desperate link builders with hefty fees, linkable ideas that fit naturally within a client’s content calendar are becoming increasingly important.
Here at Screaming Frog, my colleague (James McCrea) and I have created this comprehensive guide to ease the pressure of building links and get you on your way to forming that perfect idea.
“Forget the light bulb moment”
While some of you may rely on sudden moments of inspiration to get everything to fall into place, 95% of the time we discourage this as your primary strategy. It can happen: you sit there drinking a cup of tea and suddenly realise how to get a toilet cleaning supply company featured in The Guardian.
Unfortunately, sorry to be a buzzkill, but without doing your due diligence you will form an idea that gets just one “nofollow” link on a low authority site that positions your work underneath a male enhancement advertisement.
Instead, you should follow these steps:
Step 1 – Understand your client
It’s important your content marketing project is a natural fit within a client’s content schedule and their tone of voice. As a result, you will need to know what ideas they have previously tried, what themes they should avoid, and so on.
The easiest way to start here is by looking at their blog, searching for their brand on Google News to identify previous success and assessing their social media profiles. If they have previous topics that have been successful, it is worth assessing whether there is scope for more content within that topic that they haven’t explored previously.
It can also be helpful to learn a little about the content of the competition and find out what has worked for them in the past. A quick way to do this is to take the URL of the site you want to inspect, enter it into Ahrefs’ Site Explorer and click on ‘Top content’.
This will order the site’s pages by “social power”, a combined metric of referring domains, Facebook, Twitter and Pinterest shares. While this feature will allow you to easily spot previous link building competitor campaigns, smaller sites often don’t have as much data to analyse.
In these instances, scanning through “Best by links” can be a good solution to start to pick out patterns in the content of the client and their competitors, to see what has previously been most effective.
Step 2 – Understand your goal 
While we certainly want a client’s audience to enjoy our content, there needs to be a degree of honesty in the aim of our work. Try not to focus too much on content that you think will rank for valuable terms, generate links and social shares, and increase sales all in one.
Unfortunately, if you do, what suffers most is what you have been paid to do – generate links. The outcome of chasing all four metrics will often result in boring and confusing content that no one will want to link to.
By being honest about our goal to generate authoritative links, we can expand our thinking process of what our client’s target audience could be interested in, beyond their services or products.
Step 3 – Identify shareable content topics that resonate with your client’s niche
A crucial step in forming a great idea is to identify shareable topics within your client’s niche. The most important strategy here is to always keep up to date with the news and always be on the lookout for topics that could relate to your client’s brand. While the news is currently being dominated by COVID-19, there are still topical events like the U.S. presidential election. Therefore, presidential campaigns past and present could be an interesting theme.
Alternatively, another strategy is to search for upcoming calendar events and assess whether they relate to your client’s business. A great website for this is Awareness Days, which lists a number of different events throughout the year that every content marketer & digital PR expert should know about. For example, the beginning of July marks ‘Plastic Free July’ – a global movement to help millions reduce their plastic consumption for an entire month. This could be highly relevant to any clients that have plastic-free or eco-friendly products/services. Creating newsworthy content around plastic as a topic would therefore be a decent way to generate backlinks to your client’s site during this month.
There aren’t always going to be breaking news stories or calendar events relating to your client’s brand. To identify content topics that are more evergreen, there are two different tools you can use to help you:
BuzzSumo
If you feel your client has the luxury of being in a marketable industry, and there is a broad range of relevant topics being covered in the media, you can start to brainstorm your own topics that you think would be interesting. A useful feature here is BuzzSumo’s topic generator. This will give you a list of topics to input into their content analyzer. From this, you should be able to find out which topics are getting more coverage (not all topics are equal; the media has a natural bias towards specific topics).
Let’s say your client operates in the travel sector, you may think some interesting topics are “sustainable travel”, “stressful air travel” and “weddings abroad”. Alternatively, you may have found these topics from a broad “travel” search using the topic generator. The content analyzer returns the following results for these topics:
“Sustainable Travel” – highest shared article received 40 links and 15k Facebook shares
“Stressful Air Travel” – highest shared article received 12 links and 27.8k Facebook shares
“Weddings Abroad” – highest shared article received 3 links and 5k Facebook shares
While BuzzSumo’s link data can sometimes be inaccurate, their social data is very solid, allowing us to make judgements on shareable topics. ‘Sustainable Travel’ and ‘Stressful Air Travel’ seem to be significantly more shareable than ‘Weddings Abroad’ in this instance. Therefore, we are already starting to validate what might work and what might not work.
Another useful tactic is to input the domains of target publishers into BuzzSumo’s content analyzer. For example, if you know that your client wants links from Conde Nast Traveller, you can input their domain as shown below:
What’s returned is a list of their most engaged-with articles over the past year (although you can filter to different periods if you want to be more specific). Here we get an idea of how their team writes headlines, whether they are featuring infographics, surveys or indexes, as well as some content topics Conde Nast journalists have had success with previously (and hence may want to repeat).
Considering journalists get judged on the number of social shares their content has, it is a safe bet they will want to explore content topics again if they have worked previously. Their most engaged-with article assesses the link between wildlife and lack of activity during lockdown. As long as too much time doesn’t pass between your ideation and getting your content live, this could be a topical content theme that your key target publisher is also interested in.
Reddit
It is important to sanity check your conclusions from BuzzSumo research. Reddit is a useful tool here as it allows you to easily become absorbed within a client’s niche. You will need to look at relevant subreddits to understand what your target audience is talking about. For example, if your client operates within the human resources sector, it may be worth looking into relevant subreddits such as:
reddit.com/r/entrepreneur
reddit.com/r/BusinessHub
reddit.com/r/CareerSuccess
reddit.com/r/growmybusiness
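If you want to skim those subreddits without endless scrolling, a small script can pull the top threads for you. Below is a minimal Python sketch using the PRAW library; the credentials are placeholders you’d create yourself at reddit.com/prefs/apps, and it’s simply an illustrative aid rather than anything the tools above require:

import praw  # pip install praw

# Placeholder credentials - create a 'script' app at reddit.com/prefs/apps.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="topic-research-sketch by u/your_username",
)

subreddits = ["entrepreneur", "BusinessHub", "CareerSuccess", "growmybusiness"]

for name in subreddits:
    print(f"--- r/{name}: top threads this year ---")
    # Titles and scores give a quick feel for what the audience cares about.
    for post in reddit.subreddit(name).top(time_filter="year", limit=10):
        print(f"{post.score:>6}  {post.title}")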
Step 4 – Identify questions about your identified topics
Now you have identified some popular topics within a client’s industry, you will need to narrow down what questions your content is going to answer. Following on with our example, you have already identified sustainable travel as one of the most popular topics within travel at the moment, but what will your content answer about sustainable travel?
To look at some popular questions people have regarding sustainable travel, use BuzzSumo’s Question Analyzer. This handy feature pulls in questions from Reddit & Quora that include your topic.
Immediately you will see there are a number of questions around sustainable travel ordered by popularity:
What are tips for sustainable travel?
What are some interesting ethical and sustainable travel suggestions?
Where are sustainable traveling jobs?
When using this feature, it is worth bearing in mind that “why” questions will likely need attitudinal data to answer them. This is most commonly answered with survey data (although there can be exceptions), which can sometimes be expensive to fund.
To find more questions you can also use free tools, such as AnswerThePublic:
This gives you some options of what your content could potentially answer with the latest available data.
Step 5 – Identify emotional hooks
It’s always important to make sure your content has an aspect that elicits some sort of emotional response for your audience – this could be anything from joy or amusement to surprise or fear.
It’s easier to do this if the idea involves a relatable element for the reader: for instance, a regional idea where different locations are compared against each other using data might induce competitiveness in readers as they compare how their local city or country scores against others. Similarly, useful guide-style content might spark intrigue by giving users direct and easily actionable advice.
Step 6 – Start to predict some headlines and angles
The best ideas can often be judged on their headlines.
While you will not always know the best angles and resulting headlines before you start your research, you should be able to predict some that could apply.
As an agency, we always encourage people to think about headlines for their ideas that may appeal to journalists. While thinking about angles, try to think how your content can emotionally tell a story or how you can positively position your client’s brand as an expert in a newsworthy debate.
The best angles and headlines are unexpected and aim to actively question what we already know about a topic or bring fresh information into a topical debate. Sticking with the ‘sustainable travel’ example, if you wanted to explore ‘where has the most sustainable travel opportunities’, it is likely you already have some biases as to where has the least – think about whether these biases are likely to be topical (e.g. have certain countries recently been criticised or praised for their sustainability, or lack of it?).
In other words, the angles you should be thinking of here are:
“Are we wrong about X? They have just featured as the most sustainable for travel”
“First X ranks as the lowest for carbon emissions, now it is officially the best for sustainable travel”
Step 7 – Which format best highlights what I want to answer? 
The format of your piece needs to be rooted in the question you want answered. For example:
“What are tips for sustainable travel?” – you may want to provide a visually appealing infographic.
“What are some interesting ethical and sustainable travel suggestions?” – you may want to rank destinations in an index by their ability to encourage sustainable travel (which city has the most sustainable travel tours, sustainable transport & food options etc.)
“Where are sustainable travelling jobs?” – you may want to scrape data from Indeed/Glassdoor to find where in the world has the most sustainable travel jobs and find the roles with the most positions.
It’s also important to ensure that your format displays the piece’s information in a way that is clear and easy to understand. The more effort the audience has to put in to work out what your content is saying, the more likely it is that they’ll lose interest.
You can test this out by showing an early draft of your content to people who haven’t previously seen it and asking them how long they took to properly understand it. This is particularly true for journalists, who receive a lot of pitches every day and don’t have the time to properly delve into them all. Therefore, they’re more likely to go for content that’s immediately understandable.
While it’s great to try to stand out from the crowd, it might be a good idea to keep things fairly simple rather than create something complex that takes more time to get to grips with.
Step 8 – Are there credible sources to support my idea? 
All content needs supporting data. Even if you are looking to create an artistic visual on “what tourist landmarks could look like as a result of climate change” for your “sustainable travel” topic, you will need some credible sources to back up your claims. Make sure you have searched the internet for all publicly available data.
There might be instances where you don’t need to frantically search through Google Dataset Search. If the idea is directly linked to your client’s products, they are likely to have internal data that may support your idea! Furthermore, if the client is open to a bit more expense (or you are able to creatively play around with budgets), surveys can often be a fantastic way of creating your own datasets to directly answer any questions you want to know about.
Another route you could go down in ideation is to start by looking for interesting data and see if this gives you any content ideas. However, it’s still important to go through all the other steps in this article if you do so.
Step 9 – Sanity check your idea 
If you followed the previous steps, you should have formed an idea that:
Is in line with your client’s industry and fits their tone of voice and content schedule
Is related to a topic that is currently newsworthy and being shared at a high rate in the media
Answers a specific question that target audiences want answering within that topic
Is in a format which provides use to the overall goal of the content
Has valid sources to support it
To fully check if your idea is likely to be successful in attracting media placements, it is worth double checking your idea against the following questions:
“What could some news headlines for this campaign be?”
“Has someone currently done this? If so, can we do it better?”
“Has the type of content theme worked before? If not, why would it work now?”
“Will this content leave a favourable impression on the client’s current and new customers?”
“What publications/bloggers would write about it?”
Here at Screaming Frog, we have previously developed the STEPPS to SUCCES acronym that allows you to sanity check your content.
Conclusion
While there is certainly no right or wrong way to go about content ideation for link building, it is important to strategise and produce a ‘method to the madness’.
Whether you are just starting out or an experienced professional, we all know how difficult it can be to get everyone on board with content campaigns.
Unlike technical and on-page SEO, everyone has an opinion on what content they think will work. Because of this, in-depth research into your client’s industry is critical.
Using tools such as BuzzSumo, Reddit, Quora and AnswerThePublic, you are able to provide numerical evidence that backs up your idea. It’s hard to argue with data, so when clients or colleagues disagree, you can rely on it to resolve any conflicts.
By following these 9 steps, you are on your way to improving your ideas. However, content ideation is a craft and like any craft, practice makes perfect.
Useful resources
To truly come up with the best ideas, you need to fully immerse yourself in the industry and draw inspiration from experts, in addition to following these steps. We have previously listed some of our favourite sources of content inspiration.
The post How to Ideate Content Marketing Campaigns appeared first on Screaming Frog.
0 notes
geeksperhour · 4 years
Link
via Screaming Frog
The recent pandemic has completely restructured and realigned the way in which we live and work. Advancements in technology and shifts in organisational culture and values have meant that working from home has become increasingly common and encouraged in recent years. However, for many, the sudden requirement to relocate to a home office for an extended period will be a first-time experience, and one that naturally brings its own challenges. Maintaining a calm, positive mindset and a productive level of work can prove difficult when the outside world is going through turmoil and disruption. This blog provides five tips I have found useful for staying focused, productive and, most importantly, positive during such an unprecedented time.
1) Allocate Yourself A ‘Workspace’
Whether you are living alone in a studio apartment, at home with 2 dogs and a family or sharing a house with friends, finding and creating your new workspace with as few distractions as possible is vital for maintaining routine and productivity. Many of us are familiar with having clear distinctions between work and home life. This defined balance and physical separation means that we can disconnect and re-charge at the end of our working day. Having a designated workspace where you can ‘enter’ in the morning and ‘leave’ at the end of the day ensures that your mind is engaging and disengaging from the working day and keeping your personal life as separate as possible.
I have also found it helpful to work somewhere with close proximity to a window for fresh air, as well as an area with a door or some form of separation from the rest of the house, as it is a great way to signal to other members of your household that you are busy. Ultimately, this new area is where you will be spending the majority of your day so anything to make you feel comfortable is a must.
2) Turn off Notifications
Many of us will want to keep up to date with current affairs and any breaking news. However, this can lead to distractions as well as stress and anxiety. The media is currently saturated with important guidance and information on the pandemic, but there are also stories that spark increased confusion and concern. Turning off notifications when you are completing a task, or when you most need to focus, can help with productivity. In addition, make sure you have a trusted news source that you can refer to when you need information.
This suggestion applies not only to news apps, but also to apps such as Instagram, Facebook, WhatsApp and Twitter, where, in a home environment away from your team, you may find it tempting to spend more time scrolling through the news feed. There are several apps on the App Store that help with smartphone distractions. One I find useful is called Focus To-Do: Focus Timer & Tasks. This app combines the popular Pomodoro Technique – a time management method designed to make you work in intervals of traditionally 25 minutes, separated by short breaks – with a useful to-do list option. By prioritising your work tasks on the to-do list and using the timer, it’s a great way to really focus on the task at hand and, in turn, improve productivity.
3) Slack, Call, Email, Zoom – Whatever it takes; COMMUNICATE!
This is an absolute must. Good relationships with team members and managers contribute significantly to job satisfaction, healthy work environments and productivity. Now we are remotely working, it’s more important than ever to find new ways to connect and stay in touch with your team, not only from a social perspective, but also to ensure that workflow and projects are running smoothly. We are used to easy and effortless communication in the office, as quick as swinging around on the office chair or walking over to a colleague’s desk. Working from home can, therefore, feel relatively unstructured and isolating, making it even more vital to put a lot of emphasis on communicating with your team and actively making the effort to ask how they are doing and realign focus.
Something my team and I have made a conscious effort to do since the start of lockdown is a virtual ‘PPC Team Huddle’. Each morning, we set aside the time to video call on Microsoft Teams to check in with everyone, the on-going work, what happened the previous day and what is upcoming. At times when you feel as though you are losing touch with your teammates, this can prove extremely useful and reassuring. Additionally, it helps everyone concentrate on the outstanding tasks and can set a clear focus for the day, increasing productivity.
4) Keep Up a Routine
This may seem blatantly obvious, however, maintaining some form of routine working from home will help keep things as normal as possible, as well as keep you motivated and productive throughout the day.
Despite many of us having much more flexibility with our schedules, maintaining some form of consistency can help your body and mind adjust accordingly, regulate any underlying feelings of stress and put you into the right mindset for the working day. It also helps with defining that distinction between work and personal life. Your commute from home to the office not only gets you to a physical location but also into the right frame of mind. The time commuting in the mornings and evenings helps your mind ease into work preparing you for the day to come or allows you to switch off when the day has come to an end. Take the time in the morning to do what you would usually; eat breakfast in the kitchen, listen to music away from your designated workspace, meditate, exercise, whatever it may be to help wake you up and feel ready and energised for the workday. This routine will in turn, help you feel less anxious and more focused so that your day can be productive.
5) Take Regular Breaks
Actively take time throughout your working day to get up, move and rehydrate away from the screen. It’s human to get distracted, and there will always be distractions. After speaking to Briony Gunson, who works at beanddo, during a very insightful session on “Helpful Habits – Focus, Productivity and Wellbeing”, I learnt that we lose attention 6-10 times per minute! Acknowledging that our minds need a couple of minutes to re-focus is okay, and important for maintaining productivity levels.
Hopefully these 5 tips can help you maintain productivity levels during these very uncertain times. It would be great to hear any other tips and suggestions, so if you would like to leave a comment below please don’t hesitate to do so!
The post Working from Home During the COVID-19 Pandemic – 5 Ways to Maintain Productivity appeared first on Screaming Frog.
0 notes
geeksperhour · 4 years
Link
via Screaming Frog
I am pleased to announce the release of the Screaming Frog Log File Analyser 4.0, codenamed ‘Stay At Home’.
If you’re not already familiar with the Log File Analyser tool, it allows you to upload your server log files, verify search engine bots, and get valuable insight into search bot behaviour when crawling your website.
While everyone has more important things on their minds at this time, this small release includes limited, but essential updates that will help improve tracking of evergreen Googlebot and Bingbot and more.
Let’s jump straight to them.
1) Wildcard User-Agent Matching
You’re now able to use wildcard matching when configuring the user-agents you wish to import into a project. This makes it far more flexible, particularly when user-agent strings change regularly, such as with the new evergreen Googlebot and Bingbot.
You can choose from our pre-defined list of common search engine bots, or configure your own.
The default user-agent wildcard matching for Googlebot and Bingbot has also been updated to improve tracking of their ‘evergreen’ version naming.
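For anyone curious what wildcard matching means in practice, here’s a minimal Python sketch using fnmatch with a couple of example patterns. It’s purely an illustration of the idea (the ‘*’ absorbs the parts of the string that change between releases, like the Chrome version), not a description of how the Log File Analyser matches user-agents internally:

from fnmatch import fnmatch

# Example wildcard patterns - the '*' soaks up whatever varies around them.
patterns = [
    "*compatible; Googlebot/2.1;*",
    "*compatible; bingbot/2.0;*",
]

# An 'evergreen' Googlebot user-agent, where the Chrome version changes often.
user_agent = (
    "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
    "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/83.0.4103.61 Safari/537.36"
)

# True if the log line's user-agent matches any of the configured patterns.
print(any(fnmatch(user_agent, pattern) for pattern in patterns))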
2) Remove Parameters
You’re now able to supply a list of parameters to strip from URLs and consolidate when you import log files.
This is available in the ‘new’ project configuration and is particularly useful when you have known parameters or issues, and need to consolidate data.
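Conceptually, the consolidation works by stripping the unwanted parameters before URLs are compared. Here’s a minimal Python sketch of that idea, with a made-up parameter list; it’s an illustration only, not how the Log File Analyser does it internally:

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters to strip so URL variants consolidate into one entry.
STRIP_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def strip_parameters(url: str) -> str:
    """Return the URL with any unwanted query parameters removed."""
    parts = urlsplit(url)
    kept = [
        (key, value)
        for key, value in parse_qsl(parts.query, keep_blank_values=True)
        if key not in STRIP_PARAMS
    ]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_parameters("https://example.com/page?utm_source=x&ref=nav&sessionid=123"))
# -> https://example.com/page?ref=nav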
3) New JSON Timestamp Support
In version 3.0, we provided support for log files in JSON format. There isn’t a common standard, so we have utilised customer-provided JSON formats and provided support for as many as possible.
This support has now been extended further to cover some less common JSON timestamp examples we have been provided by users. All you need to do is drag and drop in log files (or folders) as usual, and the Log File Analyser will automatically detect the format and analyse them.
Thanks to everyone for their examples, and keep them coming!
4) Java 11 Update
While this will make little practical difference to many users, behind the scenes we have updated to Java 11. Our SEO Spider will be following soon.
More To Come
As outlined above, this is just a small update for now. However, there’s lots more to come and please do keep sending in your feature requests with what you’d like to see in the Log File Analyser.
If you’re looking for inspiration for log file analysis, then check out our guide on 22 ways to analyse log files for SEO.
As always, thanks to the SEO community for your support, and please let us know if you experience any issues with version 4.0. Keep safe, well and mostly inside for now!
The post Screaming Frog Log File Analyser Update – Version 4.0 appeared first on Screaming Frog.
0 notes
geeksperhour · 4 years
Link
via Screaming Frog
If you’ve not heard of ‘broken link building’ before, it’s essentially a tactic that involves letting a webmaster know about broken links on their site and suggesting an alternative resource (perhaps your own site or a particular piece of content, alongside any others).
There are a couple of ways that link builders approach this, which include –
Collecting a big list of ‘prospects’ such as resource pages or pages around a particular content theme or search phrase. Then checking these pages for broken links.
Another method is simply picking a single site, checking the entirety of it for relevant resource pages and broken links (and potentially creating content that will allow you to suggest your own site).
I don’t want to dig too deep into the entire process; you can read a fantastic guide over here on Moz by Russ Jones. However, as we get asked this question an awful lot, I wanted to explain how you can use the Screaming Frog SEO Spider tool to help scale the process, in particular for the first method listed above.
1) Switch To List Mode
When you have your list of relevant prospects you wish to check for external broken links, fire up the Screaming Frog SEO Spider & switch the mode from ‘Spider’ to ‘List’.
2) Remove The Crawl Depth
By default in list mode the crawl depth is ‘0’, meaning only the URLs in your list will be crawled.
However, we need to crawl the external URLs from the URLs in the list, so remove the crawl depth under ‘Configuration > Spider > Limits’ by unticking the configuration.
3) Choose To Only Crawl & Store External Links
With the crawl depth removed, the SEO Spider will now crawl the list of URLs, and any links it finds on them (internal and external, and resources).
So, next up you need to restrict the SEO Spider to just crawl external links from the list of URLs. You don’t want to waste time crawling internal links, or resources such as images, CSS, JS etc.
So under ‘Configuration > Spider > Crawl’, keep only ‘External Links’ enabled and untick all other resource and page link types.
This will mean only the URLs uploaded and the external links found on them will be stored and crawled.
4) Upload Your URLs & Start The Crawl
Now copy your list of URLs you want to check, click ‘Upload > Paste’ and the SEO Spider will crawl the URLs, reach 100% and come to a stop.
5) View Broken Links & Source Pages
To view the discovered external broken links within the SEO Spider, click on the ‘Response Codes’ tab and ‘Client Error (4XX)’ filter. They will display a ‘404’ status code.
To see which page from the original URL list uploaded has the broken links on them, use the ‘Inlinks’ tab at the bottom. Click on a URL in the top window pane and then click on the ‘Inlinks’ tab at the bottom to populate the lower window pane.
You can click on the above to view a larger image.
As you can see in this example, there is a broken link to the BrightonSEO website (https://ift.tt/2TangwT), which is linked to from this page – https://ift.tt/2GhaWYC.
6) Export Them Using ‘Bulk Export > All External Links’
This export will contain all URLs in the list uploaded, as well as their external links and various response codes.
7) Open Up In A Spreadsheet & Filter The Status Code for 4XX
The seed list of URLs uploaded are the source URLs in column B, while their external links which we want to check for broken links are the destination URLs in column C. If you filter the ‘status code’ column, you may see some ‘404’ broken links.
Here’s a quick screenshot of a dozen blog URLs I uploaded from our website and a few well-known search marketing blogs (click for a larger image as it’s rather small).
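If you’d rather do this filtering step with a script than a spreadsheet, here’s a minimal pandas sketch. The column names (‘Source’, ‘Destination’, ‘Status Code’) reflect a typical export, but do check them against your own file as headers can vary between versions:

import pandas as pd

# Load the 'All External Links' export from the SEO Spider.
df = pd.read_csv("all_external_links.csv")

# Keep only rows where the external destination returned a client error.
broken = df[df["Status Code"].between(400, 499)]

# Source = the prospect page you uploaded; Destination = the broken external link.
print(broken[["Source", "Destination", "Status Code"]].to_string(index=False))
broken.to_csv("broken_link_prospects.csv", index=False)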
So that’s it, you have a list of broken links against their sources for your broken link building. You can stop reading now, but just checking for 4XX errors will mean you miss out on further opportunities to explore.
This is because URLs might not 404 error correctly, or immediately. Quite often a URL will 302 (301, or 303) once or multiple times before reaching a final 404 status. Some URLs will also respond with a ‘no response’, such as a ‘DNS lookup failed’ if they no longer exist at all. So scan through the URLs under ‘Response Codes > No Response’ and check the status codes for further prospecting opportunities.
For 3XX responses, auditing these at scale is a little more involved, but quick and easy with the right process as outlined below.
1) Filter For 3XX Responses In The ‘Destination URL’ Column
Using the same ‘External Links’ spread sheet, scan the ‘destination URLs’ list for anything unnecessary to crawl. It will undoubtedly contain links like Twitter, Facebook, LinkedIn and login URLs etc which all redirect. Run a quick filter on this column and mass delete all the rubbish from the list to help save time from crawling them.
2) Save This New 3XX List
You’ll need this list later to potentially match back the destination URL which is 3XX’ing to its originating source URL. This is what was left in my list after cleaning up, which we need to audit.
3) Now Audit Those Redirects
Follow the process outlined in the ‘How To Audit Redirects‘ guide by saving the ‘destination URLs’ into a new list and crawling until their final target URL using the ‘always follow redirects‘ configuration to discover any broken links.
The ‘All Redirects’ report will provide a complete view of the hops and display the final destination URL.
4) Match The 4XX Errors Discovered Against Your Saved 3XX List Source URLs
The ‘All Redirects’ report may contain 4XX errors you would have missed if you hadn’t audited the 3XX responses. For example, here are a couple more I discovered using this method –
These examples include a URL which 301s to a 404, and another which is effectively a soft 404 (a 302 redirecting to a page that returns a 200 response). With this report you can match the ‘address’ URLs in ‘column A’ back to the ‘destination URLs’ and subsequent ‘source URLs’ from your saved 3XX list. Both of the above examples come from the same blog post, for instance.
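If you prefer to do this matching with a script, here’s a minimal pandas sketch. The file names and column headers (‘Address’, ‘Final Status Code’, ‘Destination’, ‘Source’) are assumptions based on the exports described above, so adjust them to whatever your actual reports use:

import pandas as pd

# The 'All Redirects' report from the redirect audit crawl, plus the saved 3XX list.
redirects = pd.read_csv("all_redirects_report.csv")
saved_3xx = pd.read_csv("saved_3xx_external_links.csv")

# Keep redirect chains that ultimately end in a client error.
dead_ends = redirects[redirects["Final Status Code"].between(400, 499)]

# Join the redirecting 'Address' back to the prospect page that originally linked to it.
matched = dead_ends.merge(saved_3xx, left_on="Address", right_on="Destination", how="inner")
print(matched[["Source", "Address", "Final Status Code"]].to_string(index=False))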
Hopefully the above process helps make broken link building more efficient. Please just let us know if you have any questions in the comments as usual.
Please remember!
This post is specifically about using the SEO Spider for broken link building. If you’re just looking to discover broken links on a single website, read our guide on How To Find Broken Links.
The post How To Use The SEO Spider For Broken Link Building appeared first on Screaming Frog.
0 notes
geeksperhour · 4 years
Link
via Screaming Frog
As the year comes to an end here at Screaming Frog, we thought we’d reflect on the ups and downed drinks throughout the past 12 months. Here’s what went down:
Spider training in London
We kicked this year off with our first-ever Screaming Frog Spider training, hosted by former frog Charlie Williams (@pagesauce). 
The first training session took place in Marble Arch, London and was a great success. We were graced with visitors from Cornwall, France and Germany!
Conferences
BrightonSEO
In April, we had our very own @OliverBrett discussing ‘How to Make Fake News for Links.‘
Rumours circulated that it was ‘Easily the BEST talk ever seen’ by many accredited in the industry (Brett, 2019). 
In September, The Screaming Frog Stand returned to Brighton SEO after last year’s success. 
This year the team came equipped with some fresh Screaming Frog merch to dish out, featuring frog-branded baseball caps/snapbacks, new for 2019.
SearchLeeds
Stop number two on @OliverBrett’s talking tour landed him at SearchLeeds. Returning after popular demand, Oliver graced the stage discussing how a ‘pee cape’ and ‘Solar Led T-Shirt’ can land you some pretty impressive national coverage.
It was great to see lots more local meetups and new conferences pop up in and around Oxfordshire.
Unboxed Oxford
The SEO team visited Oxford to attend Unboxed, a conference in a cinema also hosted by Charlie. The scenic rooftop views, excellent speakers and fun merchandise (including honey gin) make it an event not to be missed. 
ReadingSEO Meetup(s)
The team were frequent attendees of the newfound ReadingSEO Meetup organised by Reading-based SEO Agency Blue Array.
Head of SEO at Screaming Frog Patrick Langridge and former frogs Faisal Anderson and Daniel Cartland also attended as speakers.
Women in Tech SEO
Screaming Frog SEO Manager, Caroline, and Senior SEO Consultant Laura went along to one of the first Women in Technical SEO meetups in London organised by the fantastic Areej AbuAli.
It was great to attend such an event, and we have already bought our tickets to the Women in Technical SEO conference in March 2020.
Creative Duck
Some of the Screaming Frog SEO team took to the stage for their first ever public speaking appearance at the Creative Duck SEO Meetup.
The team covered the basics of SEO, link building and free SEO tools to a crowd of local businesses including wedding planners, interior designers and other local creative industries.
Fundraising
The Screaming Frog run club returned this year to raise money for Alzheimer’s Society.
unFortunately, the Reading Half Marathon was not called off due to weather conditions (unlike last year).
We were also lucky enough to have the scenic views of the Oxfordshire countryside to help us along our way and slowly build up the miles.
Although, not all training sessions were as successful as others…
Before: thank you to Alzheimer’s Society, who kindly kitted us out for the event. After: all the hard work paid off, and the Screaming Frog run club all completed the course, collaboratively raising an impressive £2,300 for Alzheimer’s Society. Thank you to everyone who kindly supported the team and donated to the fantastic charity.
We certainly celebrated in style (it just so happened to be St. Patrick’s Day).
Screaming Frog International
This year, the frogs embarked on a world tour talking at various SEO events around the globe.
Paris
(Href)Pat Lang kicked off the tour en français at SEO CAMP’us Paris.
Vegas
But that was just his warm-up taking to the stage at Pubcon in Vegas later on in the year. 
 Bali
Speaking at Brighton SEO and Leeds this year wasn’t enough for @LordOfTheSERPs. Oliver worked very hard on his carbon footprint technical SEO knowledge, talking about the SEO Spider at Digital Marketing Skillshare in Bali.
Naturally, @LordOfTheSERPs paid a visit to Hobbiton, NZ before returning back home. 
Socials
Regatta
Before we knew it, the sun was out and pints were swapped for Pimms at Henley Regatta.
Best dressed went to Jason for his sheer commitment…
Oktoberfest
For the third year running, the frogs returned to Oktoberfest singing their hearts out to Ooom Pah Pah…
It’s not all beers and pub crawls at SF socials. This year the team took on Reading Escape Room (successfully) and tested their track skills at Team Sport Reading.
[Instagram post from @screamingfroguk, 7 June 2019: The 2019 SF Grand Prix]
UK Search Awards
This year, Screaming Frog had the pleasure of being shortlisted for six search awards:
Best SEO Campaign
Best Low Budget Campaign
Best Use of Content Marketing
Best Use of Search – Finance (SEO)
Best Use of Travel /Leisure (SEO)
Young Search Professional of the Year
We were delighted to win both the Best Low Budget Campaign and Best Use of Search – Finance (SEO).
Christmas Party
To finish off a manic year, the team headed out on the annual Christmas do. Suited and booted, we took on Brakespear’s finest pubs, playing pool and darts before returning to the office for Secret Santa and the infamous SF awards.
Like what you see? Come and work for us!
We’ve welcomed a whole host of new team members to Screaming Frog this year and we’re still on the lookout for more.
We are constantly on the lookout for new team members, so why not check out what we’ve got to offer!
The post Screaming Frog 2019 – A Year in Review appeared first on Screaming Frog.
0 notes
geeksperhour · 4 years
Link
via Screaming Frog
Last week the UK Search Awards was hosted at The Brewery in London and I was lucky enough to attend with some of the Screaming Frog team. The ceremony celebrates the very best achievements in the search industry and we were delighted to win not just one, but two search awards, doubling our success from last year.
Our night started off well as we won the award for the ‘Best Use of Search – Finance’ alongside our client Moneybarn for our creative SEO campaign and subsequent results for the client.
Just when we thought the night couldn’t get any better we were announced as winners for the ‘Best Low Budget Campaign’ with our client The Solar Centre, for our festive content campaign.
We were thrilled with our double win, especially competing against such tough competition. Well done to all the winners and nominees. We’re looking forward to what 2020 will bring and hope to be making a return!
The post Screaming Frog Wins Double at the UK Search Awards appeared first on Screaming Frog.
0 notes
geeksperhour · 4 years
Text
Add custom fields to your ecommerce product page 
Does your ecommerce business sell products that involve displaying additional information or receiving product-related information from customers? Zoho Commerce’s new feature, Custom Fields, could be what you need.
What are Custom Fields?
Custom fields are product fields that can display or receive different types of information, including numbers, text, currencies, email IDs, check boxes, drop-down boxes, multi-select options, and auto-generating numbers.
You can use these fields to either display or receive data. You could use them to receive product-related info like a message to be printed on your product, or additional notes for product customizations. You can also include a check box to agree to terms and conditions, or a drop-down for an option to be selected.
To display data, you can include order-related info like a date or a text to be displayed to your customers.
How to implement custom fields in your online store
You can include custom fields in your online store by creating them from the Settings page. All custom fields are added to your products using a default layout.
What are layouts?
Layouts are groups of custom fields that you can add to your products. You can create your layouts in the Custom Fields section.
The default layout contains all the custom fields you’ve created for your products, but you can choose one containing only the custom fields that you want.
Conclusion
Use the custom fields feature on your online store whenever you need additional information for your products. Learn more about implementing custom fields here: https://help.zohocommerce.com/product-custom-fields#create_custom_fields
  from Zoho Blog https://ift.tt/35CJSuo via IFTTT
0 notes
geeksperhour · 4 years
Text
Introducing Cliq’s new logo!
If you’ve been using other Zoho apps, and know many of Zoho’s product logos are undergoing a design refresh, you probably saw this coming: Cliq has officially joined the Logolinism club!
We’re excited to reveal we’ve updated Cliq’s beloved chat bubble logo to this sleek, minimalistic new version:
But first, why change it?
The purpose of a logo is to reflect your brand’s purpose and to create a unique identity among the public. That being said, a logo should also reflect brand aesthetics while staying simple and unique.
 Following the guidelines of Logolinism—Zoho’s new minimalistic logo style characterized by simple line art, bright colors, and simplified forms—we began to remodel our existing logo to match Zoho’s fresh new look.
 The story behind our logo:
When the idea of the new logo came into discussion, it didn’t really take us too long to decide on what direction to proceed. Our logo—a simple, clean design—was created to communicate very simple ideas:
The outlines match the outside curves, or brackets, formed by the letters “C” and “Q” in our product name, “CLIQ”.
The three dots in the center signify communication, collaboration, and productivity—Cliq’s purpose and our motto.
And of course, the overall representation of a message or chat bubble highlights Cliq’s capabilities as a messaging platform.
We carefully deliberated over the variety of designs our dedicated designers came up with, and in the end, we are pleased to present a new and improved logo that carries the essence of what we want to convey.
Our new logo embraces the minimalist style of using clean, brightly colored line art to form a simplified image in place of highly detailed or realistic icons. While this style follows the current trend of minimalism, it’s also easy to understand and recognize.
The best part of this new Logolinism style is that it works for everything! We’re pleased to join other Zoho products in this ongoing design refresh, making Cliq easily identifiable as part of the Zoho family. 
P.S. If you’d like to update to or use the new logo, we’re sharing our assets right here for you to use!
  from Zoho Blog https://ift.tt/2L0ykJO via IFTTT
0 notes
geeksperhour · 4 years
Text
5 ways small businesses can build customer trust using email marketing
Every business holds on to values such as honesty, loyalty, and respect, but one value that stands as the key to success for any business is gaining and retaining customer trust.
The digital revolution is screaming for business transparency and customer trust with the rapid expansion of online reviews, community forums, and more. This clearly signals that businesses should make special efforts to establish a good brand image and embrace customer trust.
DID YOU KNOW?
85% of consumers trust online reviews as much as personal recommendations
Trust takes time and consistent effort to sprout. Customers trust businesses that’ve been around a while because time has allowed those companies to establish their brand, grow their customer base, and recruit experienced employees.
Take the case of two businesses: Business A, which has had a long-term presence, and Business B, which is very new to the market.
So, how can this new small business quickly build enough trust to compete? They must use a smarter approach to winning customer trust.
Marketing to the rescue!
Well, the good news is that the thorns on the road to trust-building for small businesses can be removed with consistent communication and a smart marketing strategy.
DID YOU KNOW?
84% of marketers believe that building trust will be the primary focus for marketing efforts in the future.
The best marketing solution for small businesses is one that is easy-to-use, is budget-friendly, and provides a flexible platform for a consistent flow of communication.
All of this can be easily served on a single plate by using an email marketing tool, which also provides a great ROI.
DID YOU KNOW?
In 2018, the number of global e-mail users amounted to 3.8 billion and is set to grow to 4.4 billion users in 2023.
Five bricks of email marketing to build trust
There are numerous ways to use email marketing to build and retain customer trust. To get started, we’ll focus on five methods.
Ask before you shake hands
In the age of GDPR and other privacy requirements and consumer protections, it’s important to have your lead or customer’s consent before engaging them with communications or promotions. One easy way to gain their consent is by using sign-up forms.
Before this step, ensure you have a website that stands as the face of your business in the digital world. It should have engaging content with a CTA (call-to-action) at the end that will convert casual readers into potential customers.
If your site content meets the needs of your users, they’ll begin to show interest in your business and will get in touch with you voluntarily. This is the ideal situation. Readers who reach out to you on their own are the ones who have begun to trust you, but their trust at this stage is still very fragile.
Embedding a sign-up form on your website can invite readers to engage with you in a hassle-free way. Many email marketing tools offer numerous sign-up form templates from which you can choose.
Nurture and build
After your readers have reached out to you, the next step to build up their trust is to nurture them with well-planned and consistent email follow-ups. The amount of effort, planning, and care you present through consistent email nurturing will show that your business is serious and eager to serve them. This will strengthen the fragile trust and help convert them into a customer. However, the drawback for small businesses is the lack of time and resources to craft and send nurturing emails.
Fortunately, your email marketing tool should offer various automation workflow series that can easily help you automate these routine engagements within a short period of time.
Be human
While nurturing your contacts with regular emails, avoid sending them out in bulk to your entire contact list. Your contacts are living, breathing humans with interests of their own, so it’s natural for them to trust the business that best understands their unique needs and serves them accordingly. This sort of personalization is a key element to increasing conversions.
To do this, use analytics to understand the behavior of your contacts, segment your contact list based on their interests, and use personalization tags to send out uniquely tailored content. 
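As a purely illustrative sketch, not tied to any particular email marketing tool, here’s how interest-based segments and a simple personalisation tag might look in a few lines of Python; the contacts, interests and subject lines are all made up:

# Toy contact list segmented by interest, with a {name} personalisation tag.
contacts = [
    {"name": "Priya", "email": "priya@example.com", "interest": "eco-friendly"},
    {"name": "Sam", "email": "sam@example.com", "interest": "budget"},
]

subject_by_interest = {
    "eco-friendly": "{name}, our plastic-free picks for this month",
    "budget": "{name}, 5 ways to save on your next order",
}

for contact in contacts:
    # Pick the segment's template and fill in the personalisation tag.
    subject = subject_by_interest[contact["interest"]].format(name=contact["name"])
    print(f"To: {contact['email']}  |  Subject: {subject}")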
Feedback matters
After consistently sending out tailor-made content to those contacts who’ve reached out on their own, it’s now time to learn what they think about your business. This will tell your customers that you value their opinions about your offering.
The best time to request feedback is usually after customers have used your product or service at least once, by using your email marketing tool to collect valuable feedback.
Retain and grow
Now you have sowed the seed of trust and the plant has grown and produced sweet fruits. So what next? Will you just abandon them and forget about them forever? Well, obviously no, you will continue to take good care of them in order to retain that trust.
A great way to do this is by giving them exclusive promotions every now and then. Offer festival gifts, special coupons, and giveaways that ensure your customers understand their presence matters for your business. 
Customer trust is a must-have insurance policy for small businesses, so start building it and watch your business grow within a short period.
Best wishes!
from Zoho Blog https://ift.tt/33lQnAw via IFTTT
0 notes