What Is the Medic Algorithm Update? (Checklist At End of Post!)

What Was the Medic Algorithm Update?

Panda. Penguin. Fred. And now, ‘Medic’.

What’s with all the weird algorithm update names?

More importantly, what was the Medic update, which happened around August 1st 2018, all about?


    “Traffic Is Down 50%!”

    It only takes a few hours of unusual ranking and traffic activity before all conversation turns to algorithm changes, especially when multiple domain owners and Search Engine Optimisation Experts all start asking the same questions at the same time.

    “Anyone else seeing big drops?”

    “Traffic is down 50% this morning. What gives?”

    “We’ve gone from ranking first across 100 big-money keywords to falling off the first page!”

    When those comments start to appear, Search Engine Roundtable’s Barry Schwartz is often the first on the scene to gather the evidence from forums such as Black Hat World and Webmaster World.

    In a round-up post, Schwartz highlighted that algorithm activity had not only been high in the run-up to August 1st, but was also exceedingly high on the day itself, regardless of the SERPs tracker used.

    Dr. Peter J. Meyers of Moz.com (affectionately known as Dr. Pete within the Moz community) followed up with further confirmation, from his own Mozcast search page and algorithm change tracker, that the algorithm had been very active on August 1st, resulting in one of the hottest days of change in 2018 so far.

     

     

    Mozcast temperatures during Medic algorithm update

    Image property of Dr. Peter J. Meyers and Moz.com

    Later on that day, Google’s Search Liaison — a PR role created by Google to offer webmasters more transparency about the way the search engine works — Danny Sullivan, updated his official Twitter account to confirm that “a broad core algorithm update” had been released that week.

     

    Sullivan went on to confirm that the update was similar to several other “broad core algorithm updates” that had occurred throughout 2018 (some without silly names).

    For many SEOs and website owners, the news that the algorithm change was “broad” brought little succour to their justified feelings of despair surrounding their dwindling keyword rankings and traffic, a loss which, for many, is more accurately measured in sales, not ranking positions.

    Who Was Most Affected by the Medic Update?

    When algorithm changes come, there can be a lot of early speculation. Is it a link-based algorithm change? Is it about mobile-friendliness? Are some websites getting penalised for being too advert-heavy? Or are some domains just plain unlucky and accidentally caught up in the mess?

    At Exposure Ninja, we don’t like to dive into algorithm change analysis too early. Instead, we prefer to wait a couple of weeks to gather complete pre and post-algorithm change ranking and traffic data to analyse. We do this for several reasons:

    • Changes in ranking, whether positive or negative, in one week are often reversed weeks later, when Google makes a post-update tweak to fine-tune or correct early mistakes
    • Traffic drops may only last a day or a few days before returning to the average

    However, there are some checks we can run through to be relatively sure that there aren’t non-algorithm related reasons for a sudden drop of traffic. These include:

    • Checking that the website is crawlable by search robots
    • Checking that the website hasn’t experienced a sudden spike in errors (such as 404 pages)
    • Checking that the traffic tracking code or software is functioning correctly
    • Checking that the rank tracking code or software is functioning correctly

    Each one of these checks tells us whether the website is in the fully accessible, fit state it needs to be in to rank well and receive traffic. There are other necessary checks that can be performed (such as checking Google Search Console), which the brilliant Aleyda Solis has collected into a sensible step-by-step walkthrough called “Why My Web Traffic Dropped?”. If you’ve seen a traffic or ranking drop recently, complete this checklist before doing anything else.
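
    If you’d like to script the first two of those checks rather than click through pages by hand, a rough sketch along these lines will do the job. The domain and URL list below are placeholders for illustration, not part of our own tooling.

```python
# A minimal sketch of the first two checks: can search robots reach the site,
# and are key URLs suddenly returning errors (such as 404s)?
# The domain and URL list are placeholder values.
import requests
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
KEY_URLS = ["/", "/blog/", "/services/"]  # hypothetical pages you care about

# 1. Is the site crawlable by search robots?
robots = RobotFileParser(f"{SITE}/robots.txt")
robots.read()
for path in KEY_URLS:
    if not robots.can_fetch("Googlebot", f"{SITE}{path}"):
        print(f"Blocked for Googlebot: {path}")

# 2. Has there been a spike in errors?
for path in KEY_URLS:
    status = requests.get(f"{SITE}{path}", timeout=10).status_code
    if status >= 400:
        print(f"{path} returned HTTP {status}")
```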

    Fortunately, there are several smart SEOs and marketing software companies out there that do have the necessary data on hand to conduct some immediate analysis into the shifting sands of Google’s search results pages, and, on August 3rd, the SEO toolbox and SERPs-tracking software company SISTRIX released a blog post summarising its findings.

    Health and Wellness Vertical Hit by Medic Update

    SISTRIX founder Johannes Beus used data gathered with his tool to highlight domains that had seen positive keyword visibility improvements in the UK and US SERPs indices, but, notably, the analysis also identified the Health sector as the vertical that experienced the most activity.

    A screenshot of SISTRIX's Medic algorithm update visibility changes for the NHS' website

    Image property of Johannes Beus and SISTRIX GmbH
    In his post, Beus importantly highlighted a core aspect of what would later become known as the Medic update. In a section subtitled “Sensitive topics show movement”, Beus references Google’s Quality Rater Guidelines (cached version for future reference) — more on this and the following quote later in this post — as well as what the search engine giant refers to as Your Money or Your Life (YMYL) pages.

    Some types of pages could potentially impact the future happiness, health, financial stability, or safety of users. We call such pages “Your Money or Your Life” pages, or YMYL.

    What Are the Google Quality Rater Guidelines?

    Leaked as far back as 2008, but released publicly in March 2013, the Search Quality Evaluator Guidelines are a 164-page (at the time of writing) set of guidelines for Search Quality Raters-in-training to read through as part of their education process.

    A Search Quality Rater (SQR) is a human contracted by Google to assess the results of given searches. Google’s search engineers use this system to better understand how users behave and interpret the results being returned during searches, especially after it makes an algorithm change.

    While in the past Google’s search engineers may have used SQRs for A/B testing and behaviour reviews that would allow them to improve the algorithm further, in the opinion of this writer, the role of an SQR today is more about verifying that the multi-faceted, machine learning-based search algorithm of 2018 is running as intended. For example, has the smart sandbox taught itself to index content well enough to improve customer satisfaction (did the searcher find what they were looking for first time)?

    If you’d like to learn about what a Search Quality Rater does and what they look at in the SERPs, consider taking the paid crash course by Search Evaluator Academy (which also lists how to apply for the work-from-home role).

    Recent Changes to the Search Quality Evaluator Guidelines

    The Search Quality Evaluator Guidelines are usually updated roughly once a year and, in July 2018, they were updated once more, as Beus highlights in his algorithm follow-up post three days later on August 6th.

    One of the main changes in the document relates to the so-called E-A-T considerations for content. E-A-T stands for Expertise, Authoritativeness and Trustworthiness.

    The changed paragraph is as follows:

    High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis.

    Change to Search Quality Guidelines on medical E-A-T

    Expertise. Authority. Trust.

    These three words have been on the minds of the most switched-on SEOs, copywriters and content marketers for a long while, but what Google expects of publishers is that they show, even to readers with no specialist knowledge, that they are experts in their chosen subject, authorities in their vertical, and worthy of being trusted by external reviewers, peers and potential customers.

    The change to the paragraph quoted above adds further clarification that content in the health space must be written or produced by people or organisations with the authority to back up any claims made on the chosen subject matter, rather than medical content being published and distributed without review or approval by a certified person or body.

    Sidenote: Interestingly, the updated guidelines were published at some point around the middle of July (wonderfully captured and reviewed by Jennifer Slegg), a period (between July 16th and July 22nd) when other “broad core updates” were said to have occurred.

    Your Money or Your Life

    Google’s first priority is to provide its customers (in this instance, its search engine users) with correct and factual information, regardless of the nature of the search. However, when a search carries serious intent, such as a medical query, Google wants to remove any potential for misinformation that could reach and threaten or endanger that user.

    Google defines pages that “could potentially impact the future happiness, health, financial stability, or safety of users” as “Your Money or Your Life” pages, and these include:

    • Shopping or financial transaction pages
    • Financial information pages
    • Medical information pages
    • Legal information pages
    • News articles or public/official information pages important for having an informed citizenry

    Added together, the importance of showcasing E-A-T for YMYL pages is compounded and, with the update to the guidelines coming not long before the algorithm update on August 1st, there’s sufficient evidence to suggest these elements could be behind the sudden ranking changes in the Health and Wellness sector.

    For a deeper explanation of what E-A-T is and why it’s so important, read our article “What Is E-A-T? (and Why It’s SO Important for SEO)“.

    Book cover of How To Get To The Top of Google 2021

    Get to the top of Google for free

    Download a free copy of our bestselling book,
    "How To Get To The Top of Google"

    The Medic Algorithm Update Earns Its Name

    On August 8th, Barry Schwartz published a post titled “Google Medic Update: Google’s Core Search Update Had Big Impact On Health/Medical Sites” and the Medic Update name was born.

    Following Schwartz’s analysis of 300 responses to a survey he had published hours earlier, the proportion of Health and Medical domains affected by the algorithm update was large enough to suggest that the SERPs for this vertical in particular were being reviewed and adapted by Google.

    On the same day, algorithm analyst and SEO consultant Marie Haynes updated her own findings from the algorithm update to include the newly-coined name.

    Through her own investigation, Haynes affirms what others had been saying regarding the disruption to domains in the medical space and also focuses on how E-A-T and YMYL are about user safety.

    It looks like the Medic name is here to stay.

    Medical + eCommerce = Bad Times Ahead

    If you are the webmaster of a medical eCommerce site, then over the past few weeks you may have been having a harder time earning ranking and traffic than most.

    Haynes not only uncovered affected Health and Wellness domains, but also eCommerce websites that had been significantly hit by the update, presumably resulting in a large drop in unit sales.

    The conclusion was that some medical eCommerce domains lack the expertise, authority or trust to continue ranking. These domains don’t have the onsite or offsite content, or the quality backlinks, required for Google to determine whether the retailer can and should be trusted.

    For example, our immediate reaction to new medical discoveries is often (but not always) one of caution and mistrust. Unless the information or technology comes from someone we trust, such as a GP, we’re likely to be hesitant to give it a try.

    By rolling out the Medic update, Google has decided to reduce the visibility of domains that are selling services or products it deems unverified by the necessary bodies — be that official government institutions, medical associations, charities for illness/disease/medical conditions, or external reviewers (including customers) — or that are publishing content written by unqualified professionals.

    Further Confirmation

    Over the days following the algorithm change, more of the Search Engine Optimisation community’s veterans published their own findings and opinions, including Dr Pete, who used data gathered through Mozcast to determine that, while the Medical vertical had seen the largest amount of activity and changes in keyword ranking, other verticals were also affected, including other sectors categorised as YMYL, such as Law, Shopping and, to a lesser extent, Finance.

     

    Image property of Dr. Peter J. Meyers and Moz.com
    Joy Hawkins of Sterling Sky also applied her Local SEO expertise to the analysis of the Medic update to highlight that it had affected both Organic and Local SEO ranking (ranking within the Map pack).

    Industry veteran Glenn Gabe contributed some incredibly insightful findings on August 9th.

    In Gabe’s article, the now familiar Medical, Health and Wellness domains were showcased, with their ranking and traffic screenshotted for collective review, but Gabe took his analysis further and speculated that the algorithm update was connected to more than just E-A-T.

    Content Quality

    Not only is it important that a domain demonstrates its expertise, authoritativeness and trustworthiness, but it must also ensure that the content’s quality matches the level of required E-A-T and the intent of the user’s navigation to that page.

    Considerable time is spent on creating content. Often, it’s for a strongly-defined purpose, but there are other times it’s done only to provide more content for Google to crawl, rather than to provide anything truly useful to an end user. Thousands of words are written, sometimes off the back of a lot of research and compiling of information from other articles on the subject, but, frequently, they are only compiled together to rank for a phrase or set of keywords.

    Content should be written for intent, not for ranking. Write for users, not robots.

     

    Quality is also measured as a whole, rather than on a page-by-page system. If a website has ten pages of high-quality content, but eighty of moderate-to-poor quality, Google is going to take that into consideration when making its ranking calculation for the entire domain.

    If your website isn’t always at its best across the board, your ranking might not be either.

    Content Accessibility

    Quality also extends to the quality and method of access to the information. Consider the following:

    • Is your website easy to use on mobile?
    • Is your website easy to navigate?
    • Is the main content easy to read?
    • Are there too many ads interrupting the reading and flow of content?
    • Does the page take too long to read?
    • Is your content original?

    If the most important part of your website — the content — is difficult or too irritating to read, then your users aren’t going to enjoy accessing it; and if there’s one thing that Google hates, it’s users having a bad experience on websites it recommends.

    If you’re not sure about the experience of your website, ask.

    Ask your website’s visitors and most trusted customers for their feedback. If anyone is going to give you feedback on how to make your website easier to use, it’s your users who know best.

    Don’t have any users to ask? Use a user testing platform, such as Usability Hub, to run an array of different tests and surveys.

    Never trust your own judgement or experience with your website. You’ll suffer the Curse of Knowledge and may not spot something crucial that’s affecting usability, because you’re so used to your own website.

    Website Health

    In addition to his checks for content quality and accessibility, Gabe also performed sample audits on a number of the reviewed domains.

    During his research, he found that some negatively hit websites were also likely to have poor website health, from technical SEO errors (crawl errors, crawl budget issues, redirects, page load times, etc.) to an excess of thin pages (low word counts) and low-value user-generated content available to be crawled and indexed by Google.

    As with content accessibility, Google prefers not to send users to websites with errors and below-par technical health, as it fears that the user experience will be negative. This would result in reduced trust in Google’s ability to not only surface the right information, but also make it easy to obtain (this is why voice search is taking off — Google can control the delivery of the information entirely).

    To perform your own sample audits, we recommend using the following tools:

    Example of SEMrush's Site Audit tool
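
    If you’d like to prototype a quick spot-check of your own before reaching for a dedicated auditing tool, a short sketch like the one below covers two of the simplest symptoms Gabe describes: error responses and thin pages. The URLs and the 300-word threshold are purely illustrative assumptions, and it’s no substitute for a full crawl.

```python
# A rough spot-check sketch: flag error responses and thin pages (low word
# counts) across a hand-picked list of URLs. URLs and threshold are examples.
import requests
from bs4 import BeautifulSoup

SAMPLE_URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/some-guide/",
]
THIN_THRESHOLD = 300  # words; pick a figure that suits your vertical

for url in SAMPLE_URLS:
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        print(f"{url} -> HTTP {response.status_code}")
        continue
    soup = BeautifulSoup(response.text, "html.parser")
    word_count = len(soup.get_text(separator=" ").split())
    if word_count < THIN_THRESHOLD:
        print(f"{url} looks thin: only {word_count} words")
```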

    Our Own Research

    We love and highly appreciate the fact-finding that the SEOs we idolise have carried out since the Medic update started to roll out at the beginning of August, but we believe that, if you want to really be sure about something, it’s best to Go, Look, See for yourself (the act of going and finding out first-hand).

    As the community had firmly deduced that the Medical, Health, and Wellness sector had seen the largest ranking changes, we settled on investigating this sector alone.

    From the outset, our methodology was:

    • To crowdsource ten health-based search queries
    • To compare the results on Page 1 of the SERPs in August against those in the last week of July
    • To review the authors of each content piece for signs of E-A-T
    • To review the reviewers and/or editors of those content pieces for E-A-T (where possible)
    • To review the publisher’s E-A-T (where no author for a page’s content was present)
    • To review the external Backlinks and Referring Domains of those URLs
    • To review the Trust Flow and Citation Flow of those URLs

    We opted against carrying out site audits for these domains, due to time and resource limitations.

    We decided to follow a manual process to visually verify the information, rather than relying entirely on automated software solutions.

    Phrases We Searched For

    In order to have a varied mix of phrases to analyse, we opted to internally crowdsource a range of phrases to choose from. This was accomplished by asking our Ninjas to submit (to an anonymous form) any medical phrase they could remember searching for in the weeks leading up to the update.

    From the thirty-one searches returned, we chose the following ten based on their search frequency. We also tried to select as varied a group as possible, from high importance/priority/risk (stroke) to lower importance/priority/risk (vegan diet).

    • how to lose weight fast
    • benefits of apple cider vinegar
    • vegan diet
    • signs of stroke
    • cbt therapy
    • crohns disease symptoms
    • signs of dementia
    • concussion treatment
    • tingling in left arm and leg
    • side effects of stopping fluoxetine

    Screenshot of our Medic algorithm update checklist

    What We Found and Observed

    During our review of pre and post-Medic update SERPs results, we found and confirmed several of the findings of our industry colleagues, including the immediate and drastic ranking declines of draxe.com (review on SEMrush), prevention.com (SEMrush), and patient.info (SEMrush).

    More importantly, we noticed several commonalities between the pages that held their ranking from July to August and those that gained rank after the Medic update.

    1. The Update Was Not Isolated to Either Desktop or Mobile

    Whether you benefited from the Medic update or were hit by it, if you were affected on one device, then you were affected on the other. This supports the theory that the Medic update was about E-A-T and quality, rather than device accessibility (we can assert that the ranking changes were not related to the mobile-first index change).

    2. Official Medical Bodies and Associations Performed Strongly

    Official medical bodies and associations with well-crafted content performed strongly and either held their high positions in Google’s search results or climbed from the middle of page one to closer to the top.

    The most notable winner was the United Kingdom’s National Health Service (NHS) digital team, which increased its potential traffic to the tune of millions of visits. Unfortunately, this did lead to some ranking losses for the Scottish NHS and Welsh NHS websites (the latter of which needs to add an SSL certificate).

    Other medical bodies that experienced visibility improvements included the Royal College of Psychiatrists, but the domains to really solidify their claims on the UK’s search results were the US’s Mayo Clinic, National Alliance on Mental Illness, and MedlinePlus.

    With many Stateside domains ranking in the UK’s search results, it begs the question: are domains in the UK not offering the right information with enough E-A-T, or is it that the UK has only one major medical body (the NHS), rather than a large split, leading to other domains opting to link to the NHS as the one and only relevant online resource?

    Also seeing further improvements was the BMJ, a medical journal, which should have no trouble showing its expertise, authority and trust.

    3. Medical Charities Performed Strongly

    Medical charities in the UK retained their previous ranking positions from July or benefitted from moderate to strong ranking improvements.

    Some of the biggest winners included the registered mental health charity Mind and the Alzheimer’s Society.

    Other charities that also held their own during the shaky period or saw a recovery (following earlier ranking drops) included Diabetes UK, the Stroke Association, the Mental Health Foundation, the Vegan Society, and Dementia UK.

    4. Content by Qualified Writers Performed Strongly

    Content created by qualified writers also performed well, although there are important caveats to acknowledge about how the qualifications of the writer are highlighted.

    Healthline.com was the standout performer, with the majority of its writers not only mentioned at the top of the page, but also linked to micro-biographies that explain their background, qualifications and experience.

    As an example, an article by Healthline on the benefits of apple cider vinegar, written by Kris Gunnars (who has a Bachelor’s Degree in Medicine), links to the website’s About page, which lists its management team, along with its collective of licensed nutritionists and dietitians.

    Additionally, a clickable “Evidence Based” button opens a modal popup with the following clarification:

    This article is based on scientific evidence, written by experts and fact checked by experts.

    Our team of licensed nutritionists and dietitians strive to be objective, unbiased, honest and to present both sides of the argument.

    This article contains scientific references. The numbers in the parentheses (1, 2, 3) are clickable links to peer-reviewed scientific papers.

    In the case of Healthline, E-A-T is thoroughly demonstrated for its entire content team, leaving little margin for criticism of the content.

    Screenshot of the Healthline nutritional authors page

    5. Content with Qualified Editors and Peer Reviewers Performed Strongly

    In cases where the author’s profile was not or could not be linked — presumably for legacy content published before author profiles were being compiled — domains that have had their content reviewed by a peer in the medical or scientific community (to establish the credibility of the content) also received ranking benefits.

    Healthline was again the best practitioner and biggest beneficiary of this method, with this article on the symptoms of a TIA/ministroke being peer reviewed by Dr. Seunggu Han, part of a medical team comprising more than fifty highly-qualified peer reviewers.

    The link on Dr. Han’s name navigates to the Medical Team page, which lists all of the members, along with their biographies (which include information about their qualifications and experience).

    Another website that saw great ranking improvements and used the peer review or qualified editor method was WebMD, a website that, while regularly being the butt of many online jokes, has been equally smart regarding its content creation and approval process.

    Where authors are not named, a reviewer is instead listed.

    An example is WebMD’s article on antidepressant withdrawal, which lists Joseph F. Goldberg, MD as the reviewer. Clicking his name links through to his profile, which demonstrates his qualifications, experience, where he works and what he does — a fantastic demonstration of expertise, authority and trust, once again.

    The Mayo Clinic medical practice’s website also uses a similar system of reviewing content where the writer isn’t always listed (see this example on Crohn’s Disease), but, instead of linking directly to the editorial team, it first links to the website’s About page, which then has a link to their Meet Our Medical Editors page. Here, individual editors are listed (including their qualifications), each with a link to their own individual profiles (example).

    Screenshot of an author page on WebMD

    6. Content without Qualified Writers or Peer Reviewers Declined

    Not everybody won during the Medic update.

    If your website operates in the medical sector and you don’t list your (or your writer’s) qualifications to write about and discuss a medical subject, or you promote related services or products for it, then it’s highly likely that you saw ranking drops around August 1st.

    One website that saw dramatic drops in visibility was health, beauty and wellness publisher Prevention.

    Screenshot of estimated traffic for prevention.com after the Medic algorithm update via SEMrush

    While reviewing some of the articles that suffered visibility drops for some of the search terms we analysed, we found dissimilarities between Prevention and domains that had seen ranking improvements.

    In this article on easing lower back pain, the author’s name is stated and acts as anchor text, but the linked-to page is simply a list of articles written by that same author. The page offers no biography or evidence that gives visitors — or Google — grounds to believe that the advice given or claims made are trustworthy.

    Google’s Search Quality Guidelines ask raters to double-check the reputation of the domain and the content creator. In a section entitled “Research on the Reputation of the Website or Creator of the Main Content”, it gives guidance on “How to Search for Reputation Information”, which includes the following (hat-tip to Jennifer Slegg for her analysis of the most recent guideline changes):

    • “For content creators, try searching for their name or alias”
    • “For content creators, look for biographical data and other sources that are not written by the individual.”

    Following the same process for our example uncovers the author’s personal website, plus dozens of other websites she has written content for in the past, including Yoga Journal, Women’s Day, Health.com and Sonima.

    Of the four domains reviewed (there were several more), only Sonima provided a biography and outbound links to Rabbitt’s social media channels (this offers good proof that the author is real, rather than a pseudonym).

    We also found Rabbitt’s LinkedIn profile, which shows a long CV and experience as a writer and editor for several health and wellness entities. Notably, however, there is a lack of the qualifications needed to write authoritatively about stem cells as a beauty ingredient, delayed sleep-phase syndrome as a sleep problem, or whether dark chocolate helps your heart.

    In no way is this a criticism of Rabbitt as a writer or the quality of her writing, nor is it a criticism of any other writer who writes across multiple subjects or domains. However, it is an acknowledgement that, as content producers, we can’t be a jack-of-all-trades and master of none. Not only is Google expecting us to showcase our expertise and authority, but it also wants to be able to trust us enough to send us users. And, at the end of the day, user trust is the only thing any business owner should care about.

    Unsurprisingly, Women’s Health Mag — another Hearst Digital Media-owned publisher — had the same issues with author attribution and authority, with accomplished journalists who are equipped to write about their experiences with a vegan diet and for high-profile publishers like Time, Cosmopolitan and The Atlantic, but are under-qualified to write deeply enough about veganism to rank for the related search term.

    Again, the author is a very qualified journalist, but not proficient enough to write about such a range of subjects that includes health conditions that can be passed through the genes, for example.

    Example of an author profile page on Prevention

    7. Content without Noted Writers or Editors, or with Ambiguous Writer Names, Declined

    If you can’t own up to doing the deed, don’t do the deed at all.

    That’s how Google feels about content it can’t verify easily — we’ll come onto how it may be verifying content shortly — whether that’s the content itself or information elsewhere on the website.

    For Counselling Directory in the UK, not listing the author of this piece of content on cognitive behavioural therapy (CBT) may explain why its rankings have dropped steeply since the E-A-T focused “broad core update” happened in March.

    While there are links to articles that do list the author of the piece, as well as a link to its very in-depth profiles (from which you can contact counsellors to book a session), the important guides don’t list who created, edited or peer reviewed the medical advice.

    There’s not even a mention of any real people on its About page, nor on its Contact page.

    If you were in a position or mental state where you needed advice on a topic such as CBT, would you trust a faceless website or business to provide you with the secure help you need?

    Another victim of the Medic update and a culprit of not attributing its medical, health and wellness content to a qualified author (or reviewer) is the UK-based health food chain Holland and Barrett.

    Regarding its previously ranked content on the benefits of apple cider vinegar, who’s to know whether it’s reliable? There’s no author; nobody claiming the content as their own who can be trusted.

    Would you trust health advice coming from a faceless person? Unfortunately, this is exactly what Holland and Barrett expects you to do.

    Holland and Barrett is a great national brand that has helped people across the UK gain access to food, cosmetics and many health products they otherwise may never have heard of, but who’s giving the advice? There’s not a single face on its About page, nor is there a list showing off its content team and their respective qualifications.

    Side-note: If you’re reading this, Holland and Barrett, give us a call.

    8. Good Content + Bad UX = Bye-Bye, Rankings

    There are a few domains I could share that tread the thin ground between placing some advertisements and full-blown, obnoxious ad placements that make the main content (almost) impossible to read.

    Prevention and its sister publications are one nominee, but the standout was eMedicineHealth.

    eMedicineHealth’s long-form content on concussion.

    While it didn’t witness massive drops during the Medic update — which we attribute to it mentioning and linking to its qualified medical author and medical editor — it did experience an approximate 5% loss in potential traffic from its estimated one million visitors. This equates to 50,000 visitors — hardly a total to be taken lightly.

    Given how claustrophobic the website feels, with ads on either side of the content compressing the main body copy into a tiny sliver of text, it is entirely possible that the website suffered from a ranking change based on the accessibility of the main body content.

    Screenshot of display ads on emedicinehealth.com

    9. It’s Not Always Immediately Obvious

    Among the casualties (haha, get it? Casualties... never mind) was US-based Dr Axe, a “Doctor of Natural Medicine” (a heavily-debated title) who promotes natural medicine and (coincidentally, perhaps) has plenty of courses and products available to buy.

    Searches around Josh Axe and the Dr Axe brand revolve around questioning its legitimacy, with some individuals even going as far as to update the pseudoscience-analysing RationalWiki page for Dr Axe using the terms “quack” and “clinically unproven”.

    The articles and guides on the Dr Axe website are impressive. They’re frequently long-form, include multimedia and have plenty of “social proof” (more than 25,000 shares on Facebook at the time of writing). They also use qualified authors — such as for this linked article, the author of which has a Master’s Degree in Clinical Nutrition and a Bachelor’s in Dietetics — but the website was hit hard by the Medic update.

    There’s no onsite validation or background information about the author beyond their listed qualifications (the kind of validation that, interestingly, has been highly visible on the domains that benefited from the Medic update). Why the backlash from Google, then? There’s a significant possibility that the distrust and negative sentiment around the Dr Axe brand has resulted in its health advice and the products it sells not being trusted by Google, with the search engine not confident that the information is trustworthy enough to provide users with the best possible experience.

    Interestingly, Google doesn’t seem to have limited the Google Ads abilities of the domain, but it did appear to reduce its Google Shopping visibility after August 1st.

    SEMrush estimated traffic for draxe.com after Medic update

    Yet another website that lists the authors of its content but doesn’t include biographies, and which was hit by the Medic update, is Patient.info.

    Its content on Crohn’s Disease is well-put together and very accessible to read, but it suffered more than a 50% loss in visibility after August 1st.

    There are ads alongside the content, but, compared to other domains, it feels much cleaner and is easier to read.

    The main problem we see with this domain is that, while the author is highly qualified and the content appears to have been reviewed by another equally qualified Doctor, the author’s bio (which we found via a Google search) is not linked. This makes it much harder for Google to easily establish E-A-T.

    Screenshot of estimated traffic for patient.info from SEMrush

    10. It Has Nothing to Do with Links

    Links. The bedrock of what it takes to rank a website. We all need them to prove that we’re worthy of ranking, right? Absolutely true, but not always.

    I compared the link totals at both the URL level and the root domain level, and found that the total number of links had no correlation to ranking shifts. Domains with fewer than 5,000 total links pointing to them were ranking above others with 100,000, 200,000, or even 300,000 root domain links.

    Even when a URL had several hundred fewer links than a competing URL, it made no difference. What stood out instead was that a URL with very few inbound links could rank above the other domains because it belonged to a charity or medical body with the specialised credibility to write on the subject.
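
    For anyone who wants to repeat that comparison on their own data, the method is simple enough to sketch. The numbers below are invented purely to show the calculation; they are not our dataset.

```python
# A hedged sketch of the comparison described above: do inbound link counts
# line up with ranking changes after the update? Figures are made up.
from scipy.stats import spearmanr

# (links pointing at the ranking URL, ranking change in positions, + = improved)
observations = [
    (120_000, -8),   # big link profile, fell
    (3_400, +5),     # small link profile, rose
    (280_000, -12),
    (1_900, +3),
    (45_000, 0),
]
links = [row[0] for row in observations]
rank_change = [row[1] for row in observations]

rho, p_value = spearmanr(links, rank_change)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")
# A rho close to zero would support the "links didn't decide this" reading.
```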

    Movement Anomalies

    One of the most peculiar anomalies we came across, where the ranking decrease wasn’t entirely obvious, was on the VeryWellHealth domain, which came up during our search for crohns disease symptoms.

    The health publisher, owned by Dotdash (formerly About.com), experienced ranking decreases immediately after August 1st.

    Its ranking article on bowel disease has all the hallmarks that Healthline’s content has, including a link to the author’s bio (which contains paragraphs demonstrating “Experience” and “Education” and outbound links to the author’s social media profiles). On some domains, this has been enough to sufficiently demonstrate E-A-T, but, for some reason, the domain has suffered in the SERPs.

    Could it be that a copywriter with a Bachelor’s in Environmental Science isn’t good enough for Google’s stringent E-A-T requirements?

    We’re guessing not. There’s plenty of content elsewhere on the site that demonstrates this isn’t the case, such as 8 Signs You’re an Introvert, which was written by an author with a BSc in Psychology and peer reviewed by a Professor of Psychiatry at Harvard Medical School.

    Even stranger is that its sister site, VeryWellMind, saw steady visibility increases, despite the fact that the collective of authors and peer review organisation is exactly the same.


    Some Domains Have Regained Their Ranking Days Later

    Not every domain that witnessed ranking decreases on the dates surrounding August 1st was unable to recover.

    While we expect that some domains are going to have to wait a while before the next large E-A-T update comes along (there are plenty of frequent “broad core updates”, but not every update is as big as this), we have spotted a few domains that lost more than 50% of their visibility, only for them to rapidly bounce back just a few days later.

    Screenshot of estimated traffic for DashDiet after the Medic update
    Screenshot of estimated traffic for DukanDiet after the Medic update
    Screenshot of estimated traffic for VeryWellHealth after the Medic update

    Why Did Some Domains Perform so Well?

    Following our own verification and the analysis of highly respected and experienced industry veterans, we came to the conclusion that the primary reason some domains performed so well is that they exhibited sufficient or strong onsite evidence of Expertise, Authority and Trust.

    We discovered that the majority of domains that either increased their market share of the SERPs or maintained visibility during this period of destabilisation tended to do the following:

    • Used specialised or qualified writers with expertise in the subject matter (whether broad or at a finer, more specialised level)
    • Used specialised or qualified editors or peer reviewers (again, with a broad or more specialised level of expertise)
    • Included a biography of the writer, editor, or peer reviewer on the page
    • Linked to a separate biography page where no on-page bio was provided
    • Linked from biographies to identity or expertise-proving URLs

    The ranking content was mostly long-form content too, backing up Glenn Gabe’s comments and findings on thin pages, UX, and site health. However, where some content was thinner, it was complemented with multimedia that extended or improved accessibility for users.

    Why Did Some Domains Perform Badly?

    Domains that suffered from the algorithm update, put simply, lacked an onsite demonstration of E-A-T.

    Pages or domains that lost visibility or keyword market share during the Medic update tended to:

    • Lack expertise, authority or trust at the domain level (i.e. the business or business owner wasn’t reliable)
    • Use authors with little to no subject expertise or sufficient qualifications for the severity of the subject matter
    • Rely on anonymous content creators throughout the site, or have anonymous website/business owners
    • Have content that was difficult to access or digest easily

    The harder it is for Google to find proof that the information on a website can be trusted, or accessed properly by the users it sends (whether via desktop or mobile), the more it will refrain from ranking that site, to avoid putting its users at any risk.

    Publishers and eCommerce sites that attempt to promote or sell services or products that may be deemed questionable (i.e. “fads”, “gimmicks” and trends) will also need sufficient credible, supported evidence to prove their value to a person’s health or wellbeing.

    Websites with excessively thin content were also likely to see ranking and traffic losses during the Medic update, as were domains that depend heavily on UGC (User-Generated Content) to rank.

    Speculation: Could Google Be Automating Author Validation?

    During the course of our research, our biggest question was: “How does Google determine E-A-T?” By reviewing the biggest winners of the Medic update, we found that the recurring theme — onsite proof, such as qualifications and experience — plays a big part. But is that all Google could be doing to determine the E-A-T of authors?

    What if Google is not determining the authority of a content creator on one website, but instead is doing it across many websites?

    The core of Google’s algorithm during the infant years of its life was built upon the simple principles that “one can simply think of every link as being like an academic citation” and that “PageRank provides a more sophisticated method for doing citation counting” (as found in Larry Page’s original PageRank draft). After the Panda and Penguin updates were rolled out and eventually deployed in real time, link totals became less important — though still high on the list of critical ranking factors — than the relevance of the linking domain, the quality of the content, its accessibility, and so on. Surely, given the sheer amount of new ideas and research into algorithmic qualification and the sorting of content in the twenty years since PageRank was first proposed in 1998, there could be a method for parsing and interpreting content that we’re not currently considering or speculating on enough?

    Screenshot from the PageRank draft proposal by Larry Page
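
    As a reminder of just how simple that original citation-counting idea was, here’s a toy PageRank calculation over an invented four-page site. It’s a sketch of the 1998 formulation, not a claim about how Google ranks pages today.

```python
# A toy implementation of the citation-counting idea behind PageRank, using
# the standard damping-factor formulation on an invented four-page graph.
DAMPING = 0.85
links = {
    "home": ["guide", "about"],
    "guide": ["home"],
    "about": ["home", "guide"],
    "orphan": ["guide"],
}
pages = list(links)
rank = {page: 1 / len(pages) for page in pages}

for _ in range(50):  # power iteration until the scores settle
    new_rank = {}
    for page in pages:
        incoming = sum(
            rank[src] / len(out) for src, out in links.items() if page in out
        )
        new_rank[page] = (1 - DAMPING) / len(pages) + DAMPING * incoming
    rank = new_rank

print(rank)  # "guide" and "home" accumulate the most citation value
```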

    During the first few days following the Medic update, I was already starting to formulate the idea that Google’s algorithm looks across multiple domains to determine the E-A-T of content producers. I guesstimated that, given the advances in machine learning and Artificial Intelligence (AI), Google could have developed a method for crawling content and determining who the author is from the subject of the copy, the style of writing, the grammar usage, the choice of words, and the types of links used (i.e. always citing the same links would flag the copy or author).

    I also considered that Google may include sentiment tracking as part of the algorithm too, primarily for businesses and brands, but potentially for authors too.

    Lacking the brain power to understand pretty much anything about machine learning, AI and general computer science, I took to Twitter, Reddit and trusted Slack channels to think out loud. I messaged people I respect and left comments in places where I hoped someone would be able to steer me in the right direction (including Reddit’s Machine Learning community).

    I received some great replies, including the following tweet from Google patent analyst and all-round genius Bill Slawski, which led me down a (deep, but fruitful) rabbit hole of information.

     

    Entity Sentiment in Ranking

    Not only did Slawski link me (in a later tweet) to a list of fifty patents that reference “sentiment” within them (which you can view here), but he also linked me to a granted patent, titled “Sentiment detection as a ranking signal for reviewable entities“.

    The patent — filed by Google employees Sasha Blair-Goldensohn, Kerry Hannah and Ryan McDonald (whose work at Google’s state-of-the-art AI research centre can be reviewed here) — includes in its abstract a proposal for “a method, a system and a computer program product for ranking reviewable entities based on sentiment expressed about the entities.” The entities, in this case, would be web pages or domains.

    The patent application is way beyond my level of intelligence, but, from my understanding of the abstract, it certainly seems that Google could be tracking the sentiment of brands and authors. There are also many other highly useful and frequently cited papers on sentiment analysis, including this study by Mika et al, which we recommend checking out (if you’re into that kind of thing).

    Gif showing the swing from happy to sad in client testimonials

    Image property of TEMBOO

    Author Identification through NLP

    Digging deeper into the recommendations and explanations I’d received — many thanks to the BigSEO community for their support, especially IBM’s Technical SEO wizard Patrick Stox, who gave my theory a boost when he said: “[Google] confirmed before that authorship died because they didn’t need it anymore, meaning they probably trained the system well enough to recognize the different folks” — I was led towards the complicated field of Natural Language Processing.

    Natural Language Processing is very complex and you may find that some SEOs or content creators use offshoots including Latent Semantic Indexing and TF-IDF in their work. But, put simply, it’s a field in which computers are used to process human language and reach an expected outcome (stipulated by a computer scientist — or keen home enthusiast), such as discovering the meaning of a sentence.

    It’s a field that Google’s AI research team will have spent years working on and, more than likely, continues to, as it hires the very best of each year’s Computer Science graduates to deepen its knowledge and the potential wider uses of the science it discovers. One of those discoveries, which found its way into public life, is called Cloud Natural Language.

    The Cloud Natural Language API analyses text to find the entities and the sentiment in the words and sentences surrounding it. It can also determine subject categories for the text provided, as well as the syntax (or arrangement of words) of it. There’s a small free test version you can use on its page. I highly recommend you try it out.

    Example of Google's Natural Language API
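
    If you’d rather call the API from code than use the in-browser demo, a minimal sketch with the Python client library looks something like this. It assumes the google-cloud-language package and credentials are already set up, import paths differ slightly between client versions, and the sample sentence is invented.

```python
# A minimal sketch of calling the Cloud Natural Language API from Python to
# pull out entities and overall sentiment for a passage of health content.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
text = "Apple cider vinegar may help with blood sugar control, says Dr Smith."
document = language_v1.Document(
    content=text, type_=language_v1.Document.Type.PLAIN_TEXT
)

# Overall sentiment of the passage
sentiment = client.analyze_sentiment(request={"document": document}).document_sentiment
print(f"Sentiment score {sentiment.score:.2f}, magnitude {sentiment.magnitude:.2f}")

# Entities mentioned, with their salience in the text
entities = client.analyze_entities(request={"document": document}).entities
for entity in entities:
    print(entity.name, language_v1.Entity.Type(entity.type_).name, entity.salience)
```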

    Other companies have also been working on NLP, including Dandelion API, which has a similar tool that lets you paste either the copy or the URL link of the page you’d like to analyse. For an example, see this analysis of Healthline’s “Beginner’s Guide to the Vegan Diet”.

    Side note: For a fantastic introduction to NLP for SEOs, I’d highly recommend the On-page SEO for NLP guide by Justin Briggs.

    In 2006, Rong Zheng, Jiexun Li, Hsinchun Chen and Zan Huang published a paper titled “A Framework for Authorship Identification of Online Messages: Writing-Style Features and Classification Techniques“, which has been cited more than 600 times. In the paper’s abstract, Zheng explains the research carried out, which included the creation of a framework that could calculate and trace the identity of a writer based upon the semantic analysis approach and the types of entities found in both NLP and Sentiment Analysis.

    We developed a framework for authorship identification of online messages to address the identity-tracing problem. In this framework, four types of writing-style features (lexical, syntactic, structural, and content-specific features) are extracted and inductive learning algorithms are used to build feature-based classification models to identify authorship of online messages.

    The potential couldn’t be clearer:

    “Inductive learning algorithms are used to build feature-based classification models to identify authorship of online messages”. We can interpret this to mean the crawling of text and determining the author based upon recurring, telltale characteristics.
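
    To make that less abstract, here’s a toy version of the idea: invented snippets from two made-up authors, with character n-grams standing in for the paper’s much richer lexical, syntactic, structural and content-specific features, fed into an off-the-shelf classifier.

```python
# A hand-wavy illustration of the "writing-style features + classifier" idea.
# Training snippets and author labels are invented for demonstration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_texts = [
    "Apple cider vinegar is often touted as a cure-all, but the evidence is mixed.",
    "Evidence for apple cider vinegar is limited; speak to your GP before use.",
    "ACV literally changed my life!!! You HAVE to try this one weird trick.",
    "This one trick melts fat fast, doctors hate it, try it today!!!",
]
training_authors = ["author_a", "author_a", "author_b", "author_b"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),  # crude style proxy
    LogisticRegression(max_iter=1000),
)
model.fit(training_texts, training_authors)

unseen = ["You won't BELIEVE this vinegar trick!!!"]
print(model.predict(unseen))  # the exclamation-heavy style should map to author_b
```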

    The deeper I dived, the more supportive documentation I found. Again, I’m no computer scientist (in fact, I didn’t even complete college), but, even with my minor detective skills, it seems a very real possibility that Google is determining the author of a piece of content and using that data to compute a trust level in that creator’s knowledge and reliability to disseminate facts about a particular subject matter.

    I even found a paper by Stanford University graduate (and now Software Engineer for Google!) Liuyu Zhou on the subject of “News Authorship Identification with Deep Learning”.

    One semi-supportive paper I found, which brought an immediate smile to my face, was the “Stance Classification in Texts from Blogs on the 2016 British Referendum”, which, unfortunately, I wasn’t able to find a free readable copy of.

    Finally, the last piece of the jigsaw that ultimately convinced me that there is a very strong chance that Google is identifying authors of content across domains was a blog post by John Bohannon, Director of Science at the San Francisco-based machine intelligence company Primer.

    In the blog post, “Machine-Generated Knowledge Bases”, Bohannon discusses how Primer discovered 40,000 computer scientists without a profile on Wikipedia who were just as qualified as other scientists who did have a profile written about them.

    By feeding scientific papers, news, and data from Wikipedia into its internally designed machine learning system, called Quicksilver, the Primer team was able to not only determine the names of frequently named people or authors who regularly created content on a specific niche field, but it also learned to generate new biographies for those people.

    Screenshot of the computer scientist biographies created by Quicksilver machine learning program

    Image property of Primer
    Bohannon was inspired by Peter et al.’s “Generating Wikipedia by Summarizing Long Sequences” study to train Quicksilver to create individual profiles (which you can see here), including a biography for Arun J. Sanyal, MD, a Professor of Medicine who was mentioned 396 times across 43 documents.

    This allows us to see machine learning and artificial intelligence not only being applied to text analysis but now also creating its own content.

    Coming to a Conclusion

    Before starting any of the research and reading discussed above, I was sceptical that my own theory could be true: that Google is building up scores for expertise, authority and trust for brands and authors as it crawls the web. After gathering all of the evidence, though, I can’t deny that there’s a very strong possibility that Google retired Authorship because it had taught its own algorithms not only to learn about content creators algorithmically, but also to determine a process and method for evaluating those authors.

    There’s nothing I’d like more than for someone to read through the evidence and further patents and papers that I’ve yet to discover and digest, but, in the opinion of this naive and excitable author, it’s very possible that Google is analysing this very sentence and building up a trust level for me right now (I hope I’ve given it enough reason to score me highly).

    Is there a scoring system for each author? Maybe. Is it 0-10 for Expertise, 0-10 for Authority, and 0-10 for Trust? Probably not.

    While you and I may make educated or hopeful guesstimations about how an algorithm as complicated as this may work, without stepping behind the wizard’s curtain we can never really be sure. Instead, each content creator, publisher and website owner should endeavour to create content with the intention of earning 10/10 scores across the board, regardless of whether there is a scoring system or not.

    How can that be done? Read on, dear reader.

    Note: Since writing the first draft of this section, noteworthy SEO veteran and expert on Google’s defunct Authorship, Mark Traphagen, published a guest post on Search Engine Journal, which discusses how Google may be determining author authority, the retired method for attributing authorship to articles and the establishment of an author’s authority across domains.

    In his post, Traphagen also speculates that the current iteration of Google’s algorithm must include some form of author attribution system using machine learning, suggesting that “Google Authorship served as training data for any future moves by Google to include author authority in its search algorithms.”

    How to Demonstrate E-A-T on Your Website

    While the Medic update has affected the Health vertical the most, we expect that every vertical is either being reviewed in the same manner or can expect to be, as the broad core updates are gradually rolled out throughout the remainder of 2018 and beyond.

    Note: Before creating any content, always determine where in the AIDA (Awareness, Interest, Desire, and Action) funnel the intent of the user’s search fits. Write for user intent, not for robots and search engines, and always write with empathy for the user’s situation to understand the reasoning behind the search. Put yourself in their shoes.
    Diagram showing the AIDA Funnel, a journey from Awareness, to Interest, Decision, and Action

    Tips for Publishers and Businesses

    1. Start by reading the guidelines

    Google is already giving us all of the information we need to know in the Search Quality Rater Guidelines.

    You can download or read a copy here or (just in case they remove it) you can read an archived copy on Archive.org.

    Set aside some time during the week, or at the weekend, to read through them with a pen in one hand and a notepad in the other. Be highly critical of your own website, and how it is organised and run by your content and editorial team, even if your team consists of only one person — you!

    If you’re not sure about the accessibility aspects of your website, use a user testing platform, such as UsabilityHub and conduct any and all tests you can afford to. Don’t trust your own impressions of your website.

    Additionally, if you’re in the medical sector, you may benefit from spending some additional time reading the NHS’ Information Standard Principles, which discuss designing your “product” (or, in this case, content) and the importance of verifying information.

    2. Demonstrate your expertise, authority and trust

    The guidelines make it clear that domains or search queries that relate to “Your Money or Your Life” (YMYL) are under the highest scrutiny, but we at Exposure Ninja would suggest that you treat every sector in the same way.

    It doesn’t matter if you’re offering a low-cost, low-threat product or service; from this point onwards you should prioritise showcasing why you are the biggest expert in your industry, the authority who everyone else goes to, and the person or brand which everyone trusts no matter what.

    Demonstrate your E-A-T on your website, but also demonstrate it offsite and offline. Just as you trust your friends to recommend a plumber you can have full faith in, you need to ensure that people online and offline are vouching for you.

    There are dozens of ways to do this and, while this includes encouraging people to link to you or leave reviews and testimonials about you online, it also extends to what people think and say about you offline. Remember that people are still people when they’re not staring at their phones or laptops. They have opinions about people and brands while they’re sitting waiting for the bus and they don’t hesitate to share them with the people around them, whether that’s the other people at the bus stop or their close circle of friends who’ll boycott a brand to demonstrate group unity.

    3. Use specialised writers

    If you’re hoping to gain a user’s — and Google’s — trust, you’ll have to use writers who are experts on your given subject matter.

    If you’re in the medical sector, look for and secure the services of writers who have qualifications and experience administering the advice and services you’re writing about, or a demonstrable history of conducting research in that field. If neither of those options is viable, look for a writer with the best qualifications you can find.

    If you work in the finance sector, hire writers who are qualified and experienced. If you cannot secure these, invest in the one-off cost of bringing in a Content Marketing Consultant who can either train your staff to be better writers or help to develop an evergreen training system that each of your employees can go through to learn how to create their own content and establish their personal expertise, authority and trust.

    These rules apply to all industries.

    When it comes down to it, it’s simple: hire great writers or train your employees to become better content creators.

    4. Use specialised editors and reviewers

    When using a specialised writer or team of writers is not possible, having a senior, specialised editor and/or peer reviewer will add credibility to the content, helping to confirm that it is factual and not liable to mislead users or put them at risk of being exploited or harmed.

    There are a number of websites for hiring specialised freelance writers, editors and peer reviewers, but you can also use a specialist service, such as Kolabtree, which allows you to hire freelance scientists and researchers from all over the world.

    5. Be cautious with guest content

    Guest content can be extremely helpful when you’re trying to add content to your website or fill in the gaps in your editorial calendar, but with your content seemingly under much stronger scrutiny than ever before, allowing just anyone to publish content on your website is risky business.

    We’re not saying you should stop using guest content; it’s far too useful to do that, but we are saying that you should audit the author before you publish anything on your website.

    Ask them what their experience is in the area and what their qualifications are, and ask them for evidence from other trusted domains (including LinkedIn) that establishes their expertise with concrete certainty.

    Before publishing anything, always ask yourself: “If I were a user of my website, how would this content improve or change my opinion of the website or business owner?”

    You probably wouldn’t trust a barista to cut your hair, so don’t allow under-qualified people to use your brand as a stepping stone for their own agenda. It’s going to make you look untrustworthy in the eyes of both your target audience and Google.

    6. Internally link between trusted content and trusted authors

    Once you’ve started accumulating trustworthy content written by expert writers, it’s time to link those pieces together so that any authority earned is distributed between them.

    This is SEO 101 and dates back to the formation of the PageRank equation. Nothing has changed in terms of how necessary external and internal linking is.

    Cross-linking your tightly related content will lend authority and trust to those pages or articles. If you have a standout piece of content that everyone in your industry is linking to and citing as the best available on that topic, distribute that authority to related articles.

    It works in a similar way to celebrity endorsements: if an industry leader writes a detailed blog post and links out to other industry peers they admire and trust, readers will start paying more attention, and giving more credence, to the words of those linked experts.
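    If you’d like a rough, programmatic view of where those internal links are missing, the short Python sketch below fetches a small cluster of closely related articles and reports which sibling URLs each page fails to link to. The URLs are hypothetical placeholders, and the script assumes the requests and beautifulsoup4 libraries are available; treat it as a starting point rather than a full link audit.

```python
# Rough internal-link audit for a small cluster of closely related articles.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

# Hypothetical cluster of related articles that should cross-link to each other
cluster = [
    "https://example.com/blog/what-is-e-a-t/",
    "https://example.com/blog/medic-update-recovery/",
    "https://example.com/blog/author-pages-and-seo/",
]

for page in cluster:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    # Normalise every anchor on the page to an absolute URL, dropping fragments and query strings
    links = {
        urljoin(page, a["href"]).split("#")[0].split("?")[0]
        for a in soup.find_all("a", href=True)
    }
    missing = [other for other in cluster if other != page and other not in links]
    if missing:
        print(f"{page} does not link to: {', '.join(missing)}")
    else:
        print(f"{page} links to every other article in the cluster")
```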

    7. Expand (or retract) your content to fit the intent

    Before you or your writing team start to create a new piece of content, first consider what the purpose of the content is.

    Take a step back and think about the situations your target customers or clients are normally in before they start searching on Google. Understand that not every searcher is going through the same process and that one user may be much further along the AIDA funnel than another.

    Use empathy to understand these scenarios and then tailor your content ideas to match them. This may include creating long-form content with all the answers to every possible question your target user may be asking, but it also might mean creating a short shot of information in only 150 words. Don’t make the ingestion of information difficult for users, just because an “expert” SEO once told you that “every page should have 500 words” — that’s just not true. Write for the users’ intent, not for what SEO experts claim Google wants.

    After that, work on earning trust for that content by internally linking to it from closely-related onsite content, or from guest posts (again, closely-related to the subject matter) published on authoritative publications.

    8. Make it easy to use

    Putting time into improving and displaying your authority, expertise and trust will be a fruitless endeavour if the content is not easy for the user to find and read.

    Your content needs to be accessible and equally easy to understand and digest on any device, especially mobile devices. It also needs to be as free of adverts as you can make it (if you depend on advertising for income, it may be time to consider a less intrusive way of deploying those adverts, or a different revenue model altogether).

    Adapt your content to be understandable via different methods.

    Could your content also be communicated as a video on the page, suited to users who hate to read long articles, but love to learn and understand new things?

    Could you use imagery to quickly communicate the core and most important parts of your copy to users who fall into the “scanning” reader category?

    Would a voice recording at the top of the page make it easier for your target audience to engage with the content while they’re going about their day, maybe as they’re preparing their child’s lunch box for school?

    Again, empathise and decide how you want to communicate your message before starting to put together your first words and sentences.

    9. Make it easy for users to verify you

    In About pages, contact details, Privacy Policies — everything — make it easy for people to find information about you or your company. People want to vet and verify that they’re dealing with real people and not being taken for a ride or scammed in any way.

    Add more details to your About page about your history, who started the company and who runs it every day. Add a “Meet the Team” page so that people can get to know the person (or people) behind your business.

    Use testimonials throughout your website. Back up the quality of your services and customer service with photos of your happy customers. To go one better, create a short video on your phone of your customer showing how happy they are with your product or service.

    Don’t use stock images if you can help it. The more personal your images are, the more people can relate to you, get to know you and trust you. Most of us have digital cameras in our pockets these days that can take great photos and, after five minutes of googling How To guides, you can easily create attractive and iconic photos (here’s a good short guide for taking product photos for eCommerce stores).

    10. Build up your offsite authority

    You also need to have other people talking about how great you are. Asking your customers to leave reviews on your Google My Business profile not only builds up trust from your prospects, but it also helps you to rank well locally.

    Use third-party review sites too, such as TrustPilot, to make sure that there’s plenty of healthy sentiment for your brand when prospects are searching for “YOUR BUSINESS reviews”. You can encourage your customers to leave testimonials on and off your site purely by asking them after you’ve delivered your service or product. You’d be surprised how many people will do this if they’re asked and provided with links to the review sites you want them to vouch for you on.

    More importantly, work hard on gaining links and brand mentions from trusted associations. There will be many unique choices for your niche, but, in the case of medical websites, I’d suggest working with medical agencies, local authorities, medical employee associations, charities, and support groups. Do so authentically and you’ll see the benefits returned to you in the long term.


    Tips for Writers

    1. Develop a speciality topic to write about

    Develop a speciality. Be a master of one trade, rather than a jack of all.

    This doesn’t mean that you have to go back to school and earn the qualifications necessary to write about your chosen subject matter, but it does mean that, by selecting a primary subject to become an expert in, you can create the best content possible for your chosen publication.

    Specialising in one subject is beneficial to both you and the publication. The editor gets expert content that they and their audience can trust, while you earn the trust of a publication that will keep coming back to you for more of your expertly written content.

    This doesn’t mean that you can’t write about different fields altogether, but specialising does make for better career opportunities (and financial benefits).

    2. Always ask for a linked author biography page, or at least a bio snippet

    Always ask for a stand-alone author page that includes your biography. If the publisher doesn’t understand the benefits of doing so, you can summarise this blog post and send it to them. Hopefully, in time, all publishers will be following the practice of utilising author pages on their websites, but, in the meantime, there may be a bit of education required. Think of it this way: not only will your potential publication be getting a great writer, but they’ll also be getting some additional SEO tips along the way. Everybody wins.

    If the publisher doesn’t have an author page for your biography, then, at a bare minimum, demand that there’s a small biography snippet somewhere on the content page you’ve created. This can either be at the beginning or the end, but request that it’s in there before you start creating your content. Some publishers may resist — small biographies like these have been abused by some SEOs in the past to insert backlinks — but persist with the request and explain the benefits if necessary.

    This should not be used as an excuse to write about some third-party client or website you’re trying to build links to. Use this only as a method of demonstrating your expertise, authority and trust in the form of your experience and qualifications.

    3. Make sure your biography includes your qualifications and experience

    If you’ve been given an author or biography page/snippet, optimise it.

    Be truthful about who you are, what your expertise is and how you’ve earned it, whether that be through work experience, education and earned qualifications, or another method of proof.

    List the university you attended or the night college you earned your certificates at. Are you writing about your expertise in yoga? Write about where you learned, who you learned from and the qualifications you may have earned while you were there.

    This can extend to any area of expertise. While I don’t have a degree in SEO, I do have several years of experience in content writing, SEO, team management and digital strategy, plus ten years of retail management prior to that (which really lends itself to understanding customer wants and needs). You can guarantee that’s what I’ll be sharing in my profiles from now on.

    4. Reference other sites where you’ve demonstrated your E-A-T

    Where possible, reference and link to other websites where you’ve established your authority. While this isn’t best done on the page of the content you’ve written (the context of outbound linking is very important), you can absolutely do this on your author page.

    You can link to other websites where you’ve written with established authority about your specialist subject. You can also link to your personal website, as this helps to establish that you’re a real person and not a pseudonym or a robot auto-generating content.

    Link out to your social media channels too. Prove that you’re real. Prove to Google and to the users who land on your author page that you’re someone they can reach out to if they’d like to learn more about you and your expertise.

    5. Make sure it’s linked to!

    There’s no point putting so much effort into your profile page and biography if it’s not linked to!

    While I do have a suspicion that Google could pick up on these pages and match the author name on the content with the name on the biography page, it’s always best to give Google a helping hand.

    Ask the publisher or website owner you’re working with to link to your profile from the content page if this isn’t done by default. It’s not complicated to do and, if the publisher isn’t aware of why they should do it, again, you can add extra value to your writing services by giving them a brief lesson on E-A-T.
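    If you want to double-check this yourself, here’s a minimal Python sketch that simply confirms each article page contains a link to your author profile URL. The profile and article URLs are hypothetical, and the script assumes the requests and beautifulsoup4 libraries are available.

```python
# Check that each published article actually links back to the author's profile page.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

author_profile = "https://example-publisher.com/authors/your-name/"  # hypothetical profile URL
articles = [
    "https://example-publisher.com/blog/your-first-guest-post/",
    "https://example-publisher.com/blog/your-second-guest-post/",
]

for article in articles:
    soup = BeautifulSoup(requests.get(article, timeout=10).text, "html.parser")
    # Normalise each anchor to an absolute URL before comparing against the profile URL
    hrefs = {urljoin(article, a["href"]) for a in soup.find_all("a", href=True)}
    if author_profile in hrefs:
        print(f"{article} links to your author profile")
    else:
        print(f"{article} is MISSING a link to your author profile")
```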

    But Dale, I’m Getting Outranked by Thin Content!

    There are several hundred webmasters commenting on forums, SEO news websites and social media that they’re being outranked by thin content: content that looks insignificant next to the 1,000-word piece they’ve put together. If I were in their shoes, I’d probably be feeling annoyed too, but I’d also start by asking myself some tough questions:

    • Is my content longer than the search intent requires? Does it need to be 1,000 words, or would 50 suffice? (A rough word-count check is sketched after this list.)
    • Did I source my content from external unqualified parties?
    • Was the content written for my biggest money pages merely sourced from other people’s pages, or maybe even taken from Wikipedia?
    • Is my author listed in the content? Do they have the experience to write about this?
    • Did I have my content peer reviewed by a specialist?
    • Could it be that my content just isn’t that good?
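    To put some rough numbers against the word-count and author questions above, here’s a minimal Python sketch that pulls the visible text of a few of your pages and prints a word count along with whether an author byline link is present. The URLs and the “/authors/” path convention are assumptions for illustration, and the script relies on the requests and beautifulsoup4 libraries.

```python
# Quick self-audit: word count and author byline link for a handful of pages.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

pages = [
    "https://example.com/services/",          # hypothetical "money" page
    "https://example.com/blog/long-guide/",   # hypothetical long-form article
]

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Remove script and style tags so they don't inflate the word count
    for tag in soup(["script", "style"]):
        tag.decompose()
    word_count = len(soup.get_text(separator=" ").split())
    # Assumes author profile pages live under an /authors/ path; adjust to your own site
    has_author_link = any("/authors/" in a["href"] for a in soup.find_all("a", href=True))
    print(f"{url}: {word_count} words, author byline link present: {has_author_link}")
```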

    If Google’s going to trust you to provide its users with the best quality customer service possible — and that’s what delivering content, services, or products is for Google — then it’s going to want to be able to establish that in your content.

    Even if Google isn’t using machine learning to match authors to content and is instead using an incredibly simple method of reviewing the authors and people attributed to writing or owning a website or business, it won’t matter how much money or time you’ve spent on your content if Google can’t determine your expertise, authority and trustworthiness.

    How Long Will It Take to Regain Ranking Visibility?

    Unfortunately, visibility or ranking for priority keywords will not return overnight. As other SEOs have shown in their own work (including Marie Haynes and Glenn Gabe), some businesses have had to wait many months or for the next “broad core update” to arrive before they started to regain their lost positions.

    The best position to take is to do everything you can to make your website a shining example of what a well-optimised and authoritative website should be. This might require continual work on a daily or weekly basis, but the long-term benefit will be an improvement in your sales figures.

    Think of it like a savings fund. You pay into it regularly, so that, when the time comes, you reap the benefits of your sensible and prudent action.

    Start future-proofing your website by not only making it fully optimised for search engines (if you’re not sure how, hire an SEO Ninja. We know a few), but also by making your content better than anyone else’s and tailored to showcase your own brilliance (E-A-T).

    And, if you’re still not sure how to optimise your website for E-A-T, please feel free to use our “How To Optimise for E-A-T Checklist”.

    Download as a Google Sheet
    Download as a Google Doc
    Download as a printable PDF
    Download as a printable Doc

    If you have any comments, feedback, or criticism (strongly welcomed), please let me know over on Twitter.

    Not Getting Enough Traffic?

    Not Converting Enough Leads?

    Get a free review of your marketing and website from our team of digital marketing experts, worth £197.

    Oh, did we say it was FREE?
