Atlantic Business Technologies, Inc.

Author: Tyler Slocum

  • The Real Story Behind Duplicate Content and Penalizations

    Duplicate content is a cardinal sin that will drive search engines to de-index your entire website. This is what many came to believe following a flurry of algorithm updates from May of 2015 to March of 2017. The truth is more complicated. Although content quality is a major factor in Google’s algorithm, many claims surrounding duplicate content penalizations are unwarranted and blatantly untrue. For that reason, I’d like to provide some best practices regarding duplicate content and how you can better optimize your content.

    What Types of Duplicates Can Be Penalized?

    Using duplicate content is inevitable, and almost every website has instances of it. According to Matt Cutts, the former head of Google’s webspam team, approximately 25–30% of the content on the internet is duplicated from another web property.

    We constantly see eCommerce retail websites using manufacturers’ product descriptions, branded boilerplate content, and legal information. And in many situations, that’s required by the manufacturer—so it’s often unavoidable. Google knows that, and they’ve explicitly stated they will not penalize a website for duplicate content UNLESS the content is being used in a manipulative, intrusive, or misleading way.

    What does Google mean by manipulative or misleading? For example, some eCommerce companies create a number of microsites with entirely duplicated content in order to rank in multiple positions for the same search query. Another example of a misleading strategy is when a blogger aggregates content from other websites instead of creating their own content. These types of sites could easily be removed from Google’s index as a penalization, but most instances will go unnoticed.

    That said, there is always a caveat. While Google won’t necessarily penalize your website for non-egregious uses of duplicate content, they do encourage and reward unique content. Google groups all of the sites that have similar content into a cluster. From that cluster, Google’s SERPs display content from the URLs with the highest domain authority. The websites with lower domain authority won’t be removed from Google’s index, but it will be challenging for them to rank higher than the sites that created unique content.

    How to Address Duplicate Content

    There are countless reasons you might accidentally create duplicate content without manipulative intentions. For that reason, here is an outline of the most frequent scenarios with duplicates and how to resolve them.

    Problem #1 – Separate URLs exist for different variations of the same product.

    We often see this problem arise on large eCommerce or retail websites. For example, a used handbag retailer might create multiple pages for the same product if they have the item in different conditions (mint, used, poor, etc.). Or, you might see a website that has the same product page listed within different categories, so the URL is technically duplicated within multiple subfolders.

    Solution – Canonical Tags

    If it makes sense from a design/UX standpoint to create multiple URLs for one product or one page, it’s paramount to tell Google this. Otherwise, you might not see any of the URLs rank well or even get indexed. In this scenario, it makes sense to utilize a canonical tag. A canonical tag tells Google there are multiple pages that feature the same content, but you only want one of the URLs to be indexed.

    A canonical tag looks like this, and should be placed directly in the page’s <head>:

    <link rel="canonical" href="https://usedhandbags.com/leather-bag/">

    Let’s apply this tag to the scenario above. The tag would be inserted into the <head> of all the duplicate pages, and it would tell Google that the href URL (https://usedhandbags.com/leather-bag/) is the only URL you want indexed. None of the other URLs will be included in Google’s search results, but you’ll at least be able to rank well for one URL, as opposed to none. If you encounter this situation, it’s also important to ensure the non-indexed URLs are easily accessible from the URL that is indexed.
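    To make the scenario concrete, here is a minimal sketch in Python (all URLs are hypothetical placeholders) of how a template might stamp the same canonical tag onto every variant of a product page:

```python
# Minimal sketch: emit the same canonical tag on every variant of a
# product page. All URLs here are hypothetical placeholders.

CANONICAL = "https://usedhandbags.com/leather-bag/"

variant_urls = [
    "https://usedhandbags.com/leather-bag-mint/",
    "https://usedhandbags.com/leather-bag-used/",
    "https://usedhandbags.com/bags/leather-bag/",  # same product, different category
]

def canonical_tag(href: str) -> str:
    """Return the <link> element to place in each duplicate page's <head>."""
    return f'<link rel="canonical" href="{href}">'

# Every duplicate page carries a tag pointing at the one URL to index:
for url in variant_urls:
    print(url, "->", canonical_tag(CANONICAL))
```

    However your CMS templates its pages, the key point is the same: every duplicate carries an identical tag pointing at the single URL you want indexed.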

    Problem #2 – There are multiple URLs for one page.

    This issue is one of the most common causes of duplicate content. After implementing an SSL certificate or modifying permalinks, you might realize that there are a lot of new duplicates of one URL. I’ve created an example below:

    http://www.example.com

    https://www.example.com

    http://example.com

    https://example.com

    Solution – 301 Redirects

    In this scenario, we see four duplicates of a website’s homepage. While canonical tags could prevent these URLs from being considered “duplicate”, it’d be more beneficial to use 301 redirects. Here’s a general rule of thumb: if you have the choice between a 301 redirect and a canonical tag, you should always do the redirect unless there is a technical reason not to. 301 redirects will pass the backlinks from one URL to another, while canonicals will simply de-index duplicate URLs without passing link juice. The only situations where a 301 wouldn’t be applicable are temporary redirects, in which case you would want to use a 302 or 307. Or, as discussed in the previous scenario, a canonical tag can be used where a URL needs to remain on a website but you don’t want it indexed by Google.
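    The rule above can be sketched in code. This hypothetical Python function (the domain is a placeholder) computes the single target that all four homepage variants would 301 to; in practice, this logic usually lives in your server’s rewrite rules rather than application code:

```python
from urllib.parse import urlsplit, urlunsplit

# Sketch of the target logic behind a 301 rule: collapse the http/https
# and www/non-www variants of a page onto one canonical URL.
# In production this usually lives in server config (a rewrite rule),
# not application code; the domain here is a placeholder.

def canonical_url(url: str) -> str:
    """Map any variant of a URL to its single canonical form."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    host = netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    # Force HTTPS and the www host; keep path and query intact.
    return urlunsplit(("https", host, path or "/", query, fragment))

variants = [
    "http://www.example.com",
    "https://www.example.com",
    "http://example.com",
    "https://example.com",
]
# All four variants should 301 to the same address.
assert {canonical_url(v) for v in variants} == {"https://www.example.com/"}
```

    Whether you prefer the www or non-www host is up to you; what matters is that every variant resolves to exactly one address.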

    Problem #3 – Pagination is causing duplicate content.

    For those who aren’t familiar with pagination, it’s defined as “the process of separating print or digital content into discrete pages.” This is a scenario we often see throughout large eCommerce websites, specifically within category pages. If a category has too many items to fit into one page, product managers will often create a series of paginated URLs all with similar content and URL structure.

    As an example, take a look at Lowes.com, where a series of paginated URLs features all of their available bathroom faucets. If every faucet were included on one URL, the page would be far too large. Each of these URLs has the exact same meta title, description, and body content. As a result, Google could easily identify them as duplicates and choose to filter them out of search results.

    Solution – Use rel="prev"/"next" tags

    If you want to avoid drops in your organic rankings, you’ll need to tell Google why these pages are so similar, and also how they relate to one another. This can be done with rel="prev"/"next" tags, which tell Google this is a series of URLs, not duplicates.

    Let’s review an example of how to properly implement these tags. In the hypothetical URL https://www.example.com/shoes/page2, we’re on page 2 of a series of paginated URLs. In order for Google to understand this is not a duplicate of the first page in the series, we’ll need to add the rel="prev"/"next" tags to the page’s <head>. The tags would look like this:

    <link rel="prev" href="https://www.example.com/shoes/page1">
         *This tag tells Google which page came before the current URL
    <link rel="next" href="https://www.example.com/shoes/page3">
         *This tag tells Google which page comes after the current URL

    After Google crawls the site, it will understand these URLs form a series, consolidate their ranking signals, and typically surface only one URL from the paginated string in search results. Without the tags, Google would likely filter out every version of the URL and rank them at a lower position.
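    These tags can be generated mechanically. Here is a small, hypothetical Python sketch (the URL pattern is a placeholder for your own structure) that builds the prev/next elements for any page in a series:

```python
# Hypothetical sketch: build the rel="prev"/"next" link elements for a
# given page of a paginated series. The URL pattern is a placeholder.

def pagination_tags(base: str, page: int, last_page: int) -> list:
    """Return the <link> elements for a paginated page's <head>."""
    tags = []
    if page > 1:  # every page except the first points back
        tags.append(f'<link rel="prev" href="{base}/page{page - 1}">')
    if page < last_page:  # every page except the last points forward
        tags.append(f'<link rel="next" href="{base}/page{page + 1}">')
    return tags

# Page 2 of a 3-page series gets both a prev and a next tag:
for tag in pagination_tags("https://www.example.com/shoes", 2, 3):
    print(tag)
```

    Note that the first page in a series carries only a "next" tag and the last carries only a "prev" tag, which is how Google identifies the ends of the chain.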

    Problem #4 – You’ve copied content from another site.

    Whether intentional or accidental, you might realize there’s content on your site that is copied directly from a manufacturer, competitor, informational website, or any other web property. This could be a lengthy quote, branded copy or even legal documentation. We’re only human, and sometimes a copy-and-paste is easier than writing your own unique content. Still, it’s poor practice and it will never help your organic presence to feature duplicated content.

    Solution – Write your own content.

    While it likely won’t cause your site to be removed from Google’s index, marketers and web managers must always view duplicated content as a last resort. If you really want to bump up your position within Google’s SERPs, take the time to write rich, digestible, and unique content.

    For any questions regarding duplicate content or canonical tags, feel free to reach out and contact us. If you’d like a higher-level view of web content strategy, continue learning with this post from our recent Thirst for Knowledge on Content Strategy for Large Organizations.

  • Know How to Win the Digital Ad Game Under Google’s New Rules

    2017 presented many challenges to the digital marketing world. Manipulative advertising, fake news, black-hat marketing, and bot traffic littered the web. The good news is that Google responded swiftly, implementing quite a few regulation changes across both the pay-per-click and organic search platforms. But it’s not only Google that reinforced its policies. Most major search engines and social media platforms have updated their regulations in hopes of preventing manipulative content and obstructive advertisements.

    As marketers, it’s our job to adapt to these challenges. We must identify strategies that will maintain effectiveness in an ever-changing digital landscape. Google’s 2018 changes will affect marketing as we know it. Be ready and know how to avoid any negative impact on your ad campaigns.

    Google Introduces Stricter Ad Regulations in 2018

    In September of 2017, Google joined forces with the Coalition for Better Ads. Not long after, the search engine behemoth was ready for action: in February 2018, Google’s Chrome web browser gained a new feature, a native ad blocker. Its goal is simple: stop the display of non-compliant advertisements. Google has joined forces with many social media platforms to support this initiative, so the impact could be drastic and far-reaching.

     

    woman browsing on laptop
    Google wants to change Chrome ads to make them better serve the user.

     

    Google’s strategy is to rid the internet of low-quality advertisements, creating a user-friendly browsing environment. While this is a noble cause, there is a great deal of risk involved. Online businesses and advertisers dread the regulatory change, and it should come as no surprise: they stand to lose millions of dollars in wasted ad spend. Ad blockers are already estimated to cost advertisers billions of dollars.

    Google Chrome is the clear front-runner as the world’s most used web browser, so its native ad blocker will cause that dollar amount to skyrocket. Here’s the added twist: your advertisement could be fully compliant with Google’s standards, yet it will not show up on a website if any non-compliant ads appear there. And even if your ad gets blocked, you’ll still have to pay for its placement. Even worse, if your site features non-compliant ads, your visitors could see a warning pop up, and your site could experience a significant drop in traffic.

    How Can Your Ads Avoid Negative Impacts from Google?

    With so many risks involved, your ad placement and quality both need careful planning. Google now scrutinizes your advertisements more than ever before, taking into account content quality, format, and animation. Google’s actions will create a domino effect: more websites and web browsers will introduce ad blockers, and marketers will kiss more dollars goodbye on blocked ad placements. Worse still, the wrong ads could land your entire website on the blacklist, meaning you would no longer be able to show advertisements on Google Chrome.

     

    Google analytics page on laptop
    If you ignore Google’s new ad policies, you could end up wasting your advertising dollars.

     

    The stakes are high. If you’re feeling concerned right now, worry not: there is a way forward. Your overarching goal should be a user-friendly web environment, meaning high-quality content and non-manipulative advertisements across your site. To clarify, let’s discuss what Google considers to be “user-friendly”.

    What Makes Google Consider an Advertisement Bad?

    • Ads that Interrupt: Low-quality pop-up ads earn their bad name for one main reason: they block content. Most pop-ups appear after the content on a page has loaded, so the user gets a brief preview before a large window blocks the page. These hellish ads are among the most commonly cited annoyances of web users. As a result, they will be the first type of advertisement to be blocked.
    • Ads that Distract: We’ve all been here. You open a web page and begin reading a few lines of text. And then WHAM! A loud advertisement begins to play, causing you to jump out of your chair. In other cases, you’ll open a web page, and suddenly a large pop-up appears, flashing like a strobe light. Any sort of ad that has distracting audio or visual effects is cause for penalization.
    • Large Sticky Ads: Have you ever encountered a large, sticky advertisement? They usually lurk at the bottom or top of a webpage. These types of ads stay in the same position as you scroll through a website. To make them disappear, you have to click “exit”. Regardless of the position of these advertisements, they’re a high blocking risk.
    • High Density Mobile Advertisements: You’ve seen these sorts of ads, most often while viewing full-page slideshows or lists. As you scroll through a slideshow, an advertisement will display between each slide. These ads? Blocked. If the advertisements on a website take up more than 30% of the vertical height of a page, they will not see the light of day.
    • Full Screen Advertisements: These advertisements often come accompanied by a 15 or 20 second countdown timer. Without a doubt, you have seen this before. We can all rejoice that Chrome will be blocking them out.

     

    What Makes Google Consider an Advertisement Good?

    • Immediate Ads: Users are much more likely to engage with ads that load fast. Accelerated Mobile Pages (AMP) have much higher click-through rates. They also earn higher eCPMs (effective cost per thousand impressions).
    • Immersive Ads: The design of an advertisement should mesh well with the design of your website. Your goal should be to create advertisements that readers seamlessly interact with.
    • Relevant Ads: This needs to be a consideration for both website owners and marketers. Are you a marketer who utilizes Google’s display network? Then your advertisements need to display throughout websites within your niche. And if you’re a website owner, you need to track the ads on your website to make sure they are relevant to your content. If there is a lack of relevance between the ad and the website, there’s a good chance Chrome will block it.

     

    Man browsing web on tablet with coffee
    Follow these best practices and you will deliver advertising that Google and your customers will love.

     

    While the new regulations from Google have the potential to derail countless marketing efforts, they also present an opportunity to step back and consider your overall digital strategy. Your keys to success in 2018 should be focused around two main objectives: high-quality content and a friendly user experience. If you craft and place digital ads that are accurate and non-obtrusive, you’ll be able to avoid any wasted advertising dollars.

    Feel free to reach out to me and my team to discuss your approach to marketing. Be sure to read up on more of Google’s policies so you can be better prepared for the changes ahead.

     

  • Non-Secure Websites, Beware! Google is After You

    In July of this year, Google will take another step forward in their crusade to secure the internet by introducing a new feature in version 68 of their Chrome browser. Its purpose? To warn users whenever they visit an HTTP website: a prominent “not secure” warning will display in the browser’s address bar. Google hopes to steer their users away from websites that don’t use Transport Layer Security (TLS). This could create many challenges for web owners and designers. Traffic and revenue losses, as well as drops in organic search rankings, could all be consequences.

    Google’s Quest For Security

    Previously, Google only displayed non-secure warnings on pages that featured password inputs or credit card fields. That standard has now been dramatically expanded with Google’s new warning system: by July, Chrome will flag ALL websites that haven’t set up HTTPS across their entire domain.

    This comes after several years of successful browser updates, through which Google has vastly increased the percentage of secure websites. In the last year alone, the share of Chrome traffic protected by HTTPS grew from 67% to 75%. Even more, 71 of the top 100 sites on the internet now use HTTPS by default, up from 37 one year ago. Chrome has also surpassed all other browsers as the most used browsing platform, which means Google’s policy update will have major implications for your site’s web performance.

     

    HTTPS sites guarantee a secure platform 

     

    What makes HTTPS different?

    Before stressing over the potential impact of this update, it’s important to recognize the countless benefits of establishing a secure connection via TLS. If your website is HTTP, as opposed to HTTPS, it means there is no active TLS certificate. A TLS certificate is a data file that binds a cryptographic key to a website’s details. In other words, this certificate creates an encrypted connection between a web server and your browser, making the connection between both points far less susceptible to being hijacked or intercepted.

    When you load an HTTP website, someone else on the network can look at or modify the site before it gets to you. This can create a world of problems for both website owners and users alike. In a recent post on Google’s developer blog, Kayce Basques explains the potential damage that can occur on an unprotected website:

    “Intruders exploit unprotected communications to trick your users into giving up sensitive information or installing malware, or to insert their own advertisements into your resources. For example, some third parties inject advertisements into websites that potentially break user experiences and create security vulnerabilities.”

    In addition, if you submit sensitive information via a form or credit card field on an unprotected site, it can be intercepted before reaching the web server. This creates a number of threats, including identity theft, fraud, and invasion of privacy.

    What are the implications of Google’s update?

    Google is increasingly using security as a ranking factor within their search engine results pages (SERPs). In 2014, Google publicly announced that websites would receive a boost in rankings if they switched from HTTP to HTTPS. In line with that policy, sites that remain HTTP are at risk of losing rankings, a serious threat to the acquisition of organic traffic on HTTP websites.

    There is also the added risk of dropping conversion rates and losing customers. Studies show that 85% of web users would choose not to make purchases from a website labeled “non-secure”.

    If you’re concerned about the potential impact of this upcoming Chrome update, or the security of your site, contact the experts at Atlantic BT.

     

  • What Does SSL Mean and Why Should You Care?

    Most of us have spent enough time online to notice some websites have “http” at the beginning of their URLs, while others use “https”. However, many people don’t understand the difference between the two. To make things simple, the S in HTTPS stands for Secure, and what we call SSL is a “Secure Sockets Layer.” The term SSL is still widely used to describe a critical aspect of web security, though you should note SSL has become insecure and has been superseded by the more secure Transport Layer Security, or TLS. This kind of cryptographic protocol is not only essential to the security of your website, but also has a major impact on your organic visibility, SEO, and website performance.

    Why Does an SSL or TLS Matter?

    SSL/TLS is the foundation for secure browsing; it protects users from sharing their sensitive personal information. A TLS Certificate is a small data file that digitally binds a cryptographic key to a website’s details. In layman’s terms, this certificate creates a secure, encrypted connection between your browser and a web server. This secure connection means the encrypted information can ONLY be opened and seen by the user and the website—preventing the connection from being hijacked or intercepted.

    So why is this secure connection so important? Say, for example, a user visits an eCommerce website and they’re asked to submit personal information like their email address, mailing address, credit card information, or bank account number. What many people don’t realize is the information the user provides is passed from computer to computer before reaching its final destination. Without a TLS certificate, that sensitive information could potentially be acquired by any of the computers that it passes through. The TLS safeguards that personal information so it can only be seen at the final web server that the user sends information to.

    This process makes TLS vitally important to the security of your websites and users. Not only does TLS protect user information by encrypting the connection, it also verifies you are actually connected to the right server (rather than a server that intercepted your traffic). In case the security of your entire website wasn’t enough, SSL/TLS also has a significant effect on your SEO ranking.
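    Both of those guarantees, encryption and verification of the server’s identity, are baked into modern TLS libraries by default. As a short illustration, here is a sketch using Python’s standard-library ssl module:

```python
import ssl

# Sketch: the standard library's TLS defaults illustrate the two
# guarantees described above: encryption of the connection plus
# verification that you reached the right server.
ctx = ssl.create_default_context()

# The default context refuses connections whose certificate does not
# match the hostname -- that is the "right server" check.
assert ctx.check_hostname is True
assert ctx.verify_mode == ssl.CERT_REQUIRED

# Wrapping a socket with this context encrypts all traffic, e.g.:
#   with socket.create_connection(("example.com", 443)) as sock:
#       with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
#           print(tls.version())  # e.g. "TLSv1.2" or "TLSv1.3"
```

    Browsers perform the same checks automatically, which is exactly why they can warn users when a site’s certificate is missing or invalid.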

    How Does an SSL or TLS Affect SEO Rankings?

    Google and the other major search engines have coined a term called “trust factors” or “trust seals” that signify a website’s identity is authentic, the site is legitimate, and the site is not susceptible to data breaches. These “trust factors” can include badges from the Better Business Bureau, detailed privacy policies, and most importantly, SSL/TLS certificates. Sites that utilize these trust factors are more likely to receive higher quality scores and hence, better SEO rankings.

    During the summer of 2014, Google explicitly stated websites would receive a ranking boost if they featured a TLS certificate. In addition, they provided the following “best practices” for getting started with TLS:

    • Decide the kind of certificate you need: single, multi-domain, or wildcard certificate.
    • Use 2048-bit key certificates.
    • Use relative URLs for resources that reside on the same secure domain.
    • Use protocol relative URLs for all other domains.
    • Check out Google’s Site move article for more guidelines on how to change your website’s address.
    • Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.

    While the amount of ranking boost provided by implementing TLS is still unclear, we do know that Google can severely penalize unencrypted sites. And in January of 2017, Google began warning Chrome users on unencrypted pages that collect passwords or credit card details. Since Chrome is the primary browser for over 55% of web traffic, this is something that should not be taken lightly.

    If you fail to include SSL/TLS on your site, not only can your site be demoted in search engine rankings, your users are less likely to make purchases or interact with your webpages. According to SSL.com, implementing SSL/TLS securely can lead to dramatic improvement in a website’s conversion rate. And in contrast, lacking SSL/TLS protection can have negative effects on conversion rate.

    Integrate TLS on Your Sites Immediately

    In conclusion, SSL/TLS certificates are vital to any website, and especially important to the performance of online stores and eCommerce sites. Without encrypting your website, you are leaving yourself susceptible to malicious attackers, data breaches, and lost customers. Unencrypted websites are much less likely to rank well in search engines, and even less likely to convert users into customers.

    If you’re concerned about the security or performance of your website, contact ABT’s cybersecurity department, or continue reading the ten critical elements of a successful retail e-commerce website.