Atlantic Business Technologies, Inc.

Category: Development

Here, you’ll find insights on programming languages, frameworks, and techniques that shape the web and software landscape. Whether you’re a developer looking to refine your skills or simply curious about how things work behind the scenes, this space offers practical knowledge and thoughtful perspectives.

  • How to win more business with engaging long-form content.

    Writing content that converts is a balancing act. You want to write in a way that engages readers, but you also need to rank on Google so people can find your page in the first place. Both attracting users and keeping them engaged are important elements of the conversion process. 

    Our tips for boosting user satisfaction in lengthy content include:

    1. Use highly specific titles.
    2. Let headings protect readers from consuming irrelevant content.
    3. Incorporate an interactive table of contents.
    4. Get designers and developers involved in blog UX.
    5. Only include useful images.
    6. Optimize page load times.

    First of all, how does Google rank blog content?

    In many instances, Google uses ranking factors that aim to increase user satisfaction. For example, a low bounce rate and high time on page signify quality to search engines. 

    However, this is only a small piece of the pie when it comes to serving the best content to users. That’s because Google is also on a mission to rank the most authoritative and credible content. 

    Factors that signify factually correct information include:

    • Content length
    • Linking out to stats
    • Having authoritative websites link to your content

    SEO expert Neil Patel emphasizes the importance of content length in his blog post: How to Make Every Blog Article You Write Rank High in Google Search.

    And according to HubSpot’s article on the ideal blog length:

    “For SEO, the ideal blog post length should be 2,100-2,400 words, according to [our] data.”

    Here lies another challenge for content strategists. How do you keep readers engaged with a blog that is 2000+ words long? Will publishing lengthy, factual blogs harm user satisfaction and engagement?

    These guidelines will help you increase engagement with long-form content.

    When longer content is handled with care, you can make it digestible for readers and point them only to information that they need so they can skip the rest. 

    Here are some tips to harmonize lengthy content and user satisfaction:

    1. Use highly specific titles.

    This blog could have been titled something like:
    • “Writing better content in 2020.”
    • “The balancing act of readability and crawlability.”
    • “Tips for writing better lengthy blogs.”

    Each of these titles states the gist of the piece but leaves out important details. People reading these titles would have to dig through the blog to find out whether the information is actually useful, and I’d be lucky if users bothered to do that!

    Instead of choosing a title that is vague, a cute play on words, or a phrase stuffed with keywords, point to the actual focus of the article. Drawing a central focus to your content lets readers know they are in the right place. 

    Here are some of my favorite titles from Atlantic BT’s blog:

    While these titles tell you exactly what you are going to read about, words like “hack,” “game changer,” and “killing” are sure to pique a reader’s interest or create a sense of urgency.

    2. Let headings protect readers from consuming irrelevant content.

    I traditionally see headings as one- to four-word phrases used to break up content. In these situations, users are forced to read paragraphs following a heading to gain context.

    We don’t want to make readers do extra work! Instead, make headings as descriptive as your titles. This way, readers can truly sift through a long blog by jumping to the most useful sections. 

    3. Incorporate an interactive table of contents.

    Now that you have written descriptive headings, compile them in the beginning of your article. Users can click anchor links to jump to sections without scrolling. 

    I used an interactive table of contents in the beginning of this article. This tutorial walks you through some simple HTML to add one yourself.
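If your platform doesn’t offer a table-of-contents block, a minimal sketch of generating anchor links from your headings might look like this (the heading text and the `toc` class name are illustrative, not from a real site):

```typescript
// Turn a heading into a URL-safe anchor id (e.g. for <h2 id="...">).
function slugify(heading: string): string {
  return heading
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, '') // drop punctuation
    .trim()
    .replace(/\s+/g, '-');        // spaces become hyphens
}

// Build an anchor-link list from the article's headings.
function buildToc(headings: string[]): string {
  const items = headings
    .map(h => `<li><a href="#${slugify(h)}">${h}</a></li>`)
    .join('\n');
  return `<ul class="toc">\n${items}\n</ul>`;
}

console.log(buildToc([
  'Use highly specific titles.',
  'Optimize page load times.',
]));
```

Each heading on the page then needs a matching id, such as `<h2 id="use-highly-specific-titles">`, so the anchor links jump to the right section.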

    4. Get designers and developers involved in blog UX.

    One might argue that a blog is meant to have a simple layout that lets words speak for themselves. On the other hand, some design elements will facilitate reading rather than distract from the content.

    Atlantic BT recently did a blog upgrade that incorporated some of these elements. For example, at the top of blogs we use a progress bar. Hovering over the dots will tell you which section you can navigate to and clicking the dots will take you there. Try it out above!

    Switching our blog to use the Gutenberg editing experience also gave us some new flexibility with blocks. Our design and development teams worked to build callout features and different variants for displaying images.

    Here are a few examples of what we can do:

    5. Only include useful images.

    Throughout the content of a blog, you will often find random pictures used to help “break the blog up” or “provide something interesting to look at.” 

    Assume that any image that doesn’t add value is a distraction.

    Instead, incorporate graphs, charts, or screenshots of examples to support your point.

    6. Optimize page load times.

    Google research has shown that visitors tend to leave a page if load times exceed three seconds. In fact, 47% of consumers expect a page to load in two seconds or less. 

    Some of our top tips for increasing page speed include optimizing images and removing third-party scripts. However, the right CMS, hosting, and development team can identify and implement more technical factors that will significantly reduce load times.
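As a rough, illustrative sketch of the third-party script tip (the markup and hostname below are made up, and a real page should be handled with a proper HTML parser), you can inventory which scripts load from other domains before deciding what to remove:

```typescript
// List the third-party <script> sources in a page so you can decide
// which ones to remove. "example.com" stands in for your own domain.
function thirdPartyScripts(html: string, ownHost: string): string[] {
  const srcs = [...html.matchAll(/<script[^>]*\bsrc="([^"]+)"/g)].map(m => m[1]);
  return srcs.filter(src => {
    try {
      // Relative URLs resolve to your own host and stay off the list.
      return new URL(src, `https://${ownHost}`).host !== ownHost;
    } catch {
      return false; // unparsable src: ignore it
    }
  });
}

const page =
  '<script src="/app.js"></script>' +
  '<script src="https://cdn.tracker.example/t.js"></script>';
console.log(thirdPartyScripts(page, 'example.com')); // only the tracker script remains
```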

    Because page speed is both a factor for organic rankings and user satisfaction, we cannot emphasize its importance enough!

    Putting these tips into practice facilitates content positioning.

    As the blog manager for Atlantic BT, I frequently rely on subject matter experts to write content. Sometimes we’re able to outline the pieces together. Other times, I’m handed a 1,000 word draft that reads as a stream of consciousness or a journal entry. And it’s totally okay! If we expected our technical experts to be content strategists, I wouldn’t get to be one.

    In many situations, I’m able to follow the above rules to frame the content and position it in a direct, consumable fashion that caters to the user groups we serve.

    Content strategy, design, and technology work together to support long-form content.

    The content you serve is crucial, but the way it’s presented can take it to the next level. For this reason, building thought leadership through well-structured articles goes far beyond copywriting. Your CMS, flexible design, and information architecture are all important elements in a business-winning content strategy.

  • How do I calculate the ROI of a new website?

    If you’re contemplating changes to your website, it’s crucial to first understand the web design cost associated with making those changes. This understanding will allow you to calculate the potential return on investment (ROI), helping you determine whether the proposed changes will be beneficial to your bottom line.

    The average lifespan of a website is 3-5 years. After this period, aspects such as design and device compatibility may become obsolete. However, Atlantic BT has seen websites last much longer. That’s because these custom web projects include ongoing maintenance:

    • Framework updates
    • Updates to underlying software packages
    • Server hardware updates (or migrating to AWS with a well-architected framework and careful attention to workload-specific services)
    • Continued design and user experience tweaks based on data

    It takes a continued investment to maintain a reliable, secure, and performant website. In reality, these long-lasting websites never really stopped the development and improvement process. We like to implement new technologies and constantly experiment to stay current.

    Is experimenting with new technologies worth the investment?

    You are faced with two options: overhauling your website every 3-4 years or investing in continuous enhancements. Your decision ultimately depends on which option gives you the highest ROI. 

    Let’s say you are at year 2 and start to see the benefits in switching your platform. How should you calculate ROI for the investment? 

    Frame your mindset in three simple questions:

    1. How much does my website impact my revenue currently?

    It can be challenging to measure revenue if you aren’t directly selling products online. In businesses outside of eCommerce, it usually boils down to analytics and recognizing patterns in inbound leads. 

    Do 100% of your leads come in digitally? How does word-of-mouth impact your leads? Are you doing any other non-digital or traditional marketing?

    2. Is my website an important tool in my strategy going forward?

    What is your digital strategy moving forward, and how does your website play into it? Will you be driving ads to landing pages, hosting whitepapers, or creating a login portal for customers to self-serve?

    Take some time and determine how much your website will contribute to revenue growth.

    3. What are the opportunity costs of waiting?

    Now that you have determined how you will leverage your website, calculate the opportunity cost of going through your strategy with outdated technology, aging web designs, or a poor user experience.

    Would you benefit from going headless? What will happen if you skip Drupal 10? Will your WordPress site face security risks from skipping the next update? Do you have a way to draw insights without a custom dashboard? Are customers using a mobile device to access your website?
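The three questions above can be folded into a back-of-envelope calculation. Every figure and field name below is hypothetical, purely to show the shape of the math:

```typescript
// All figures and field names here are hypothetical, purely for illustration.
interface RebuildEstimate {
  buildCost: number;             // one-time cost of the rebuild
  annualRevenueLift: number;     // yearly revenue attributable to the new site (question 2)
  annualOpportunityCost: number; // yearly revenue lost by waiting (question 3)
  years: number;                 // evaluation horizon
}

// ROI = (total gain - cost) / cost, counting avoided opportunity cost as gain.
function estimateRoi(e: RebuildEstimate): number {
  const gain = (e.annualRevenueLift + e.annualOpportunityCost) * e.years;
  return (gain - e.buildCost) / e.buildCost;
}

const roi = estimateRoi({
  buildCost: 150_000,
  annualRevenueLift: 60_000,
  annualOpportunityCost: 20_000,
  years: 3,
});
console.log(`Projected ROI over 3 years: ${(roi * 100).toFixed(0)}%`); // 60%
```

The hard part is not the arithmetic but defending the two annual estimates, which is exactly what questions 1 through 3 are meant to surface.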

    Recognize hidden opportunity costs.

    Answering these questions is just the beginning of the journey as the need for change on digital platforms gains momentum. Here is a breakdown of opportunity costs we see companies missing when they wait too long to update.

    Next-level technology becomes more accessible to you and competitors every year. 

    We are seeing upticks in the velocity of framework changes, the number of frameworks and languages available, persistent threats to common web technologies, and groundbreaking Artificial Intelligence as a Service (AIaaS) platforms. This is all powered by the underlying “law of accelerating returns” of technology.

    The power that we can harness from the underlying systems to run software, leverage machine learning, and fight off bad actors is growing alongside the speed of processors, the speed of networks, and the companies that allow us to consume this power in minute-by-minute cost structures. 

    The web design cost of rebuilding increases as technology becomes more outdated.

    The older a site gets, the older the technology used to build it. A new feature could be 25-100% more expensive to build on an aging stack. Over the total cost of ownership of a website, there is a tipping point where rebuilding now is more cost effective than waiting. (Think of how costly repairs can be on very old cars or slightly old cell phones.) 

    The cost of internal productivity.

    Your company is likely to interact with your website much more frequently than any individual customer or user. For this reason, the productivity of your employees will be negatively impacted as the application ages. 

    If a rebuild can increase the performance, improve the usability, or automate some aspects of your site, you could potentially offset having to hire additional staff. 

    The cost of falling behind user expectations.

    The internet has greatly accelerated the pace of change in user behavior and expectations. As the chart below demonstrates, waiting five years between updates can lead to missing or being late to adapt to major user behavior changes. This is an excerpt from the Forrester Research report, Winning the New B2B Buyer, 2020.

    The cost of not prioritizing experience. 

    With an increasing shift in the balance of customer interactions from personal to online, the online experience you provide to your customers is no longer a nice-to-have, but an essential part of the experience your customer has with your business.

    “Experience-Driven Businesses report driving faster topline growth, with an average revenue growth rate of 15%, compared to an average of 11% among other companies in our survey.”

    “Experience-driven businesses grew revenue 1.4x faster and increased customer lifetime value 1.6x more than other companies in the past year.”

    Forrester Research, The Business Impact Of Investing In Experience, April 2018

    Ready to get started?

    It is our opinion that you should not wait to get started on any digital journey, and you should always experiment with new technology. The web design cost ROI, if you choose wisely, will be measurable. Contact us for a free consultation to get started strategizing.

  • Coping with COVID-19 and preparing for the future of digital.

    Take a look at how COVID-19 has impacted our organization, what to expect in a post-pandemic world, and how we are embracing the future.

    A new normal: working fully remote.

    I am personally settling into week 9 of isolation and social distancing, and it’s been fun to watch other people adjust to becoming remote workers.

    In 2019, Atlantic BT decided that we needed to open our minds to people working remotely full-time. This would help us recruit top talent across the East Coast and give employees the flexibility they needed. This change also aligned well with my move to Ottawa, Canada, as I took the plunge into working fully remote for the first time.

    We encouraged the remaining ABT-ers in Raleigh to explore working from home a few days a week to avoid traffic and maximize their flow.

    What this means during this time of “Business as Unusual” is that Atlantic BT is already prepared to be fully operational as a remote team. We are still managing web applications, writing quality code, and settling into video conversations. Some of the benefits we are seeing include:

    • Better written documentation
    • More chatter that’s searchable and archivable
    • Finding creative ways to socialize (We are still having our 2 4 7 mini game breaks for some virtual games at home.)

    How will reopening impact our digital future?

    While we understand that these unusual times are putting a strain on many of you in all facets of life, I think it’s important to focus on the post-pandemic world and look to our digital futures. 

    At ABT, we are taking this time to streamline our processes to provide more value at a lower cost, develop more useful and value driven baselines for our favorite platforms (WordPress, Drupal, Magento, Shopify, React), and tackle enterprise integration as we never have before. 

    If there is one thing we can count on, it’s that we as a collective humanity will emerge from this as changed for better and worse. While popular media wants to emphasize the worst, I want to think about the better for a bit.

    Industrial Automation

    We will emerge a truly digital society where curbside pickup and take out persist outside of the restaurant industry as a way to do business. This will force us to rethink industrial automation as it is related to B2C eCommerce.

    Hopefully we can move a bit past the Gig Economy into the Robot Economy (I for one welcome our robot overlords, just wanted to get that out there before the Singularity).

    IoT and Healthcare

    We will be socially distancing for a bit and need to come up with new ways to take care of each other. In the digital world, we will see an explosion of IoT healthcare monitoring that will change the way we travel and convalesce.

    This explosion will put more pressure on privacy and security frameworks to encrypt and make portable this data that ultimately belongs to the individual. We will be better off for sharing this information globally in a secure way.

    eCommerce transactions and shipping

    ECommerce will continue to grow at a COVID-19-like pace (too soon?), but we need to be prepared. Current credit card and shipping solutions won’t cut it.

    As merchants see profound impacts to their ability to earn revenue, the credit card industry is going to have to ease costs and provide more security around transactions using biometrics, cryptocurrency, and temporary transaction numbers.

    Shippers are going to have to figure out machine learning and how it applies to density and logistics in a new way.

    We all see multiple delivery trucks pass our houses/apartments each day. We need to see true improvements in payment technology and logistics/supply chain improvements past what Amazon Prime has already taught us. 

    Increased work-from-home opportunities

    It’s clear that working from home will become more common. We are already in the habit and have become comfortable with these setups. New tools and processes will continue to improve, and companies are beginning to realize the benefits of remote work. Furthermore, we may use work from home as a solution to prevent the spread of future illness.

    Reduced commuting will also allow our earth to heal a bit from the overconsumption of fossil fuels and maybe actually put a dent in global warming.

    While it’s hard to look past these more pressing issues to the long term, we will be able to make the case for more remote work and better-integrated digital platforms that allow for connected remoteness while keeping our communication secure.

    Let’s have a conversation.

    As ABT prepares to thrive in the future and comes out of this “Business as Unusual” time hardened and ready to tackle these problems with creative agility, let us know if there is anything we can do to support you. We are truly a people-first company and no digital ask is too big or small. If you need help, let me know!

  • Shared Google Authorization with an Angular site and .Net Core API.

    There are many Angular tutorials for setting up websites using the Angular framework and .NET Core APIs. Likewise, there are many walkthroughs for integrating Google authentication with each. However, implementing these solutions separately yields the need to authenticate through Google twice: once for the Angular site and once for the API.

    This article provides a solution that allows shared Google authorization through authentication on the Angular site. To avoid authenticating a second time, pass the token through a standard header to the API and use Google libraries to validate and authorize.

    Technology used in this Angular tutorial.

    This post assumes you’ve got a basic Angular website and Web API project running. The approach will also likely work for any Angular 2+ site (or other front-end site) where Google authentication occurs, and it should work if your Web API project targets .NET Core 2+.

    The site I’m working with is designed to be exclusively authenticated through Google; however, this method could be extended to handle multiple authentication formats (assuming there are .Net validation libraries for them, or you write your own). One other aspect to mention is that I am not storing any user data in a database.

    Using the Angular site, Google login, and local storage as a start.

    The primary goal is to make sure you have access to Google’s idToken after authentication. The default angular-social-login setup is pretty simple to get working. This article is a good walkthrough, and it also covers setting up the Google App if you need it. I can’t find the original post I followed, but this Stack Overflow post shows storing the Google user/token in state for future calls.

    This code block (customauth.service.ts in the Angular site) just shows that on user subscription the user is stored in local storage:

      constructor(
        public authService: AuthService,
        public router: Router,
        public ngZone: NgZone // NgZone service to remove outside scope warning
      ) {
        // Persist the logged-in user in localStorage (or clear it on sign-out)
        this.authService.authState.subscribe(user => {
          if (user) {
            this.userData = user;
            localStorage.setItem('user', JSON.stringify(this.userData));
          } else {
            localStorage.removeItem('user');
          }
        });
      }
    
      // Sign in with Google
      GoogleAuth() {
        return this.authService.signIn(GoogleLoginProvider.PROVIDER_ID);
      }
    
      // Sign out
      SignOut() {
        return this.authService.signOut().then(() => {
          localStorage.removeItem('user');
          this.router.navigate(['/']);
        });
      }

    Options researched before finding the current solution.

    • The Microsoft standard way to handle Google authentication. This is slick if you’re building an MVC site and need to allow Google auth, but I couldn’t find a way to send over the token, as this generates and uses a cookie value with an Identity.External key.
    • JWT authorization is an option, but the tutorials got heavy quickly. Since I don’t need to store users or use Microsoft Identity, I blew past this.
    • A custom policy provider is another Microsoft standard practice. There might be a better way to accomplish the solution using this approach, but I didn’t walk this path too far since I wasn’t using authentication through the .Net solution.

    The solution: a .Net Core custom authorize attribute.

    I used this Stack Overflow post about custom auth attributes to hook up the solution. This is what allows the shared Google authorization using a standard authorization request header.

    Approach

    1. In Angular
      1. Build the Authorization header using the Google idToken.
      2. Pass the header for any authorize only API endpoints.
    2. In the web API
      1. Enable authorization
      2. Create a custom IAuthorizationFilter and TypeFilterAttribute
      3. Tag any controllers or endpoints with the custom attribute

    I provide code samples for these steps below.

    Angular API calls with an authorization header.

    The code in the api service (api.service.ts in Angular Site) grabs the id token from the user in local storage and passes it through the API call. If the user is logged out, this header isn’t passed.

    import { Injectable } from '@angular/core';
    import { HttpClient, HttpHeaders } from '@angular/common/http';
    import { SocialUser } from 'angularx-social-login';
    import { environment } from './../../environments/environment';
    
    @Injectable({ providedIn: 'root' })
    export class ApiService {
      apiURL = environment.apiUrl;
      user: SocialUser;
      defaultHeaders: HttpHeaders;
    
      constructor(private httpClient: HttpClient) {
        this.user = JSON.parse(localStorage.getItem('user'));
        this.defaultHeaders = new HttpHeaders();
        this.defaultHeaders = this.defaultHeaders.append('Content-Type', 'application/json');
        if (this.user != null) {
          this.defaultHeaders = this.defaultHeaders.append('Authorization', 'Bearer ' + this.user.idToken);
        }
      }
    
      public getAccounts() {
        const accounts = this.httpClient.get<Account[]>(`${this.apiURL}/accounts`, { headers: this.defaultHeaders });
        return accounts;
      }
    }

    Enabling authorization in the .Net Core project.

    In the StartUp file (StartUp.cs in the API project), authorization has to be enabled.

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
      ...
      app.UseRouting();
      app.UseAuthorization();
      app.UseEndpoints(endpoints =>
      {
          endpoints.MapControllers();
      });
    }

    The custom filter attribute to validate without another authorization.

    This creates the attribute used for authorization and performs a Google validation on the token.

    This application is used only for our Google G Suite users, and thus the “HostedDomain” option of the ValidationSettings is set. This isn’t necessary, and I believe it can simply be removed if you allow any Google user to authenticate.

    I’ve named this file GoogleAuthorizationFilter.cs in the API project.

    using Google.Apis.Auth;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.AspNetCore.Mvc.Filters;
    using System;
    
    namespace YourNamespace.API.Attributes
    {
        /// <summary>
        /// Custom Google Authentication authorize attribute which validates the bearer token.
        /// </summary>
        public class GoogleAuthorizeAttribute : TypeFilterAttribute
        {
            public GoogleAuthorizeAttribute() : base(typeof(GoogleAuthorizeFilter)) { }
        }
    
    
        public class GoogleAuthorizeFilter : IAuthorizationFilter
        {
    
            public GoogleAuthorizeFilter()
            {
            }
    
            public void OnAuthorization(AuthorizationFilterContext context)
            {
                try
                {
                    // Verify the Authorization header exists
                    var headers = context.HttpContext.Request.Headers;
                    if (!headers.ContainsKey("Authorization"))
                    {
                        context.Result = new ForbidResult();
                        return;
                    }
                    var authHeader = headers["Authorization"].ToString();

                    // Verify the header starts with "Bearer " and actually carries a token
                    if (!authHeader.StartsWith("Bearer ") || authHeader.Length <= 7)
                    {
                        context.Result = new ForbidResult();
                        return;
                    }

                    // Grab the token and verify it through Google. If verification fails, an exception is thrown.
                    var token = authHeader.Remove(0, 7);
                    var validated = GoogleJsonWebSignature.ValidateAsync(token, new GoogleJsonWebSignature.ValidationSettings()
                    {
                        HostedDomain = "yourdomain.com",
                    }).Result;
                }
                catch (Exception)
                {
                    context.Result = new ForbidResult();
                }
            }
        }
    }
    

    Putting the custom attribute in place.

    This is just a snippet of code, as on your controllers you only have to add one line of code (well, two including the using statement). If the GoogleAuthorize attribute doesn’t validate, the call returns access denied.

    using YourNamespace.API.Attributes;
    
    [GoogleAuthorize]
    [ApiController]
    public class AccountsController : BaseController {
    

    Voila! No need for a second authentication.

    The .Net API is now locked down only to requests originating from a site with Google authentication. The custom attribute can be extended for additional authentication sources or any other desired restrictions using the request. I like the simplicity of a site which allows Google auth only, but it wouldn’t be a stretch to add others – and I really like not managing any users or passwords. I hope this Angular tutorial for shared Google authentication works well for you too!

  • The ins and outs of a complex content strategy.

    Content strategy involves the planning and creation of copy for your business.

    The principles of a basic content strategy for your website include:

    • Deciding who your customers are and what they want to read
    • Performing keyword research
    • Crafting copy or a content plan within these guidelines

    When is a basic content strategy not enough?

    If your website gets hundreds of thousands of visitors a month, the decisions you make about content could drastically impact your business. For this reason, content strategies for larger websites go beyond basic copy. They also incorporate:

    • Detailed persona research and testing
    • Categorizing information (Information Architecture)
    • Navigation and design
    • Selection of a Content Management System (CMS) to support user level access, page creation, and approval workflows

    For example, Atlantic BT has been faced with the following scenarios:

    • When redesigning a website for the Department of Revenue, how do you label information so people can easily find the tax forms that pertain to their situation?
    • When designing a university website, how do you point prospective students to information when they are at different phases of the decision and enrollment process?
    • When designing an eCommerce website, how do you categorize thousands of products and incorporate search?
    • When creating 1 page of content takes 3 weeks, how do you simplify the CMS experience to reduce turnaround?

    Atlantic BT has developed a proven process for Content Strategy.

    An all-encompassing content strategy will include persona and market research, information architecture, user testing, and supporting technologies. Take a look at the steps and considerations involved.

    Defining personas.

    Persona research varies on a client-by-client basis. Personas may include both internal users of the website and external users (customers and clients). Through interviews, market research, and web behavior analysis, Atlantic BT creates detailed profiles that describe a member of the audience segment: their preferences, their perspective, their background, and what influences them.

    Creating a content plan and information architecture.

    Atlantic BT develops a comprehensive Content Strategy using a structured sequence of research, workshops, and strategy development methods. Websites of all sizes benefit from going through these steps.

    • Content inventory: We carefully examine a website to locate and identify existing content.
    • Content audit: We take this content and evaluate its usefulness, accuracy, tone of voice, and overall effectiveness. We use in-house tools and Google Analytics to score web content.
    • Content analytics: We review content ranking and keyword usage within analytic tools.
    • Audience mapping: This includes the mapping of content to different audiences and use cases.
    • Information grouping: We define user-centered topics and relationships between content. This could include grouping content by service categories or the persona it serves.
    • Card sorts: Card sorts can be conducted as live workshops or online. This testing method identifies the way users understand and group the content being presented to them.
    • Taxonomy development: We create a definition of a standardized naming convention (controlled vocabulary) to apply to site content.
    • Descriptive information creation: We define useful metadata that can be utilized to generate “Related Link” lists or other navigation components that aid discovery. This could include tagging eCommerce products into categories or tagging articles by topic.
    • Governance: We define the desired editorial workflow and build the role/auth model to fit. Your Content Management System will facilitate this workflow.

    Your Content Management System plays an important role.

    Choosing technologies to support your content strategy includes selecting the right CMS, or customizing an existing CMS, to meet needs. Some considerations when choosing a CMS include:

    • Will you need different levels of user permissions?
    • What will the publishing workflow be?
    • How much would you like users to be able to customize pages?

    Implementing the right CMS can simplify the content publication process and create brand consistencies across templates.

    A Content Strategy is only useful with adequate training.

    Employees need to be able to use the system with ease and publish content frequently. First, we provide guidance on choosing what to write about and where to publish it on your website. Second, we provide training to use your CMS to the fullest, which could include drafting content or designing new pages.

    Ready to take a deeper dive into content strategy?

    If you’re interested in learning more about user behavior on your website, how to structure your content effectively, or which tools will support your goals, we’re happy to help you get started. Contact us for a free consultation.

  • Slow page speed is killing your business. Here’s how to fix it.

    Slow page speed is killing your business. Here’s how to fix it.

    Page speed is a critical element of a revenue-generating website, but it’s often overlooked and can be a challenge to fix. In this post, you’ll learn why it’s worth the investment to improve page speed as we break down 6 key factors to share with your development team.

    The 6 key factors to improving page load are:

    1. Reduce the number of third-party scripts
    2. Caching and minification
    3. Optimize images
    4. Lazy load images
    5. Serve content from a Content Delivery Network (CDN)
    6. Remedy redirect chains

    How important is page speed?

    As a general rule of thumb: the faster your website loads, the better the user experience. But do you know how many opportunities you are missing out on with slow page load time?

    Load time exceeding 2 seconds increases the exit rate.

    First of all, a Google study found that 53% of mobile site visits are abandoned when load times exceed 3 seconds. And people’s standards for speed have only increased over the years: 47% of consumers expect a page to load in 2 seconds or less.

    Faster page speed means higher rankings.

    Google takes user preferences into account with its ranking algorithm. Therefore, they favor pages with fast performance.

    Google’s Webmaster blog confirmed the importance of speed in a July 2018 announcement:

    “Today we’re announcing that starting in July 2018, page speed will be a ranking factor for mobile searches… We encourage developers to think broadly about how performance affects a user’s experience of their page and to consider a variety of user experience metrics.”

    Furthermore, recent studies by HubSpot, Backlinko, and OptinMonster each list site speed as a top ranking factor for Google (both mobile and desktop).

    What happened when Atlantic BT tested page speed factors?

    When we noticed a decrease in organic performance on Atlantic BT’s own website, we decided to perform a technical SEO audit to diagnose the issue. Using Lighthouse and GTmetrix, free tools that provide detailed reports on site performance, we found that our page speed was not up to par. Between these two tools, our Front-End Development team compiled a list of changes to make to our website.

    We tested speed with and without these factors, isolating variables to determine the most effective items and their impact on speed scores.

    Atlantic BT’s Page Speed Audit Checklist:

    We found a handful of items to be the most impactful on page speed. Take a look at the top 6 factors to increase website performance, and be sure to share with your development team!

    1. Reduce the number of third-party scripts:

    We found third-party scripts to have the largest effect on slow load times. These scripts include Hotjar, Google Tag Manager, the Facebook pixel, and any other JavaScript you use to track ad campaigns and web behavior.

    If a third-party script doesn’t add clear value to your site, remove it. For example, pause Google Tag Manager scripts that aren’t being actively used to track campaigns. Similarly, tools like Hotjar can provide useful data in a limited window: run them long enough to gather the data you need, then disable them. Finally, optimize the loading process for the scripts you decide to keep.
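    One common way to optimize the scripts you keep is to inject them only after the page’s own load event, so they don’t compete with your content for bandwidth during first render. The script URL below is a placeholder:

    ```html
    <script>
      // Inject a third-party script after the page has finished loading,
      // so it doesn't block first render. The URL is a placeholder.
      window.addEventListener('load', function () {
        var s = document.createElement('script');
        s.src = 'https://example.com/analytics.js';
        s.async = true;
        document.body.appendChild(s);
      });
    </script>
    ```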

    2. Caching and minification:

    There are many tools and services for JavaScript and CSS file compression such as Uglify JS, YUI Compressor, Minify, and Node-Minify.
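    To illustrate what these minifiers do under the hood, here is a toy CSS minifier: strip comments, collapse whitespace, and drop spaces around punctuation. Real tools handle far more edge cases, so treat this only as a sketch of the concept:

    ```javascript
    // Toy CSS minifier, for illustration only. Production minifiers
    // (cssnano, YUI Compressor, etc.) handle many more edge cases.
    function minifyCss(css) {
      return css
        .replace(/\/\*[\s\S]*?\*\//g, '')  // remove /* comments */
        .replace(/\s+/g, ' ')              // collapse runs of whitespace
        .replace(/\s*([{}:;,])\s*/g, '$1') // trim spaces around punctuation
        .trim();
    }
    ```

    Fewer bytes over the wire means less time to download and parse, which is exactly the gain minification buys you.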

    Likewise, you can use a service for caching. Depending on how your website is built, you can try some of these options:

    • WordPress: Atlantic BT uses W3 Total Cache plugin on our WordPress website.
    • Drupal: Visit their wiki to find a partial list of the top used modules for improving performance and scalability.
    • Magento: Varnish is integrated into Magento 2.x by default and only requires a few configuration changes to get started.
    • .NET frameworks: The .NET framework offers various classes for caching along with custom classes to extend caching.
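    All of these caching layers share the same core idea: store the result of expensive work and reuse it instead of recomputing. A minimal in-memory sketch (the render step is a hypothetical stand-in for real page generation):

    ```javascript
    // Minimal in-memory page cache sketch. Real caching layers add
    // persistence, invalidation, and size limits on top of this idea.
    const cache = new Map();

    // Hypothetical expensive render step.
    function renderPage(slug) {
      return `<h1>${slug}</h1>`;
    }

    function getPage(slug, ttlMs = 60000) {
      const hit = cache.get(slug);
      if (hit && Date.now() - hit.time < ttlMs) {
        return hit.html; // cache hit: skip the render entirely
      }
      const html = renderPage(slug);
      cache.set(slug, { html, time: Date.now() });
      return html;
    }
    ```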

    You can use a JavaScript build system like Webpack to compile and compress your site’s custom JavaScript and CSS. Other widely used JavaScript build tools/task runners are Gulp and Grunt. All of these can be used in any CMS or non-CMS site.
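    As a minimal sketch, a Webpack setup like the one below compresses your bundle automatically: setting mode to 'production' enables Webpack’s built-in Terser minification. The entry and output paths are assumptions for illustration:

    ```javascript
    // webpack.config.js — minimal sketch. 'production' mode turns on
    // built-in JS minification; entry/output paths are examples only.
    const path = require('path');

    module.exports = {
      mode: 'production',
      entry: './src/index.js',
      output: {
        filename: 'bundle.min.js',
        path: path.resolve(__dirname, 'dist'),
      },
    };
    ```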

    3. Optimize images:

    If your website is on WordPress, you have access to several plugins that easily optimize images. Atlantic BT currently uses EWWW Image Optimizer to bulk optimize and auto-generate .webp images for all custom WordPress image sizes. We then serve .webp images with a fallback to PNG or JPG format.
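    Serving .webp with a fallback can be done with the standard <picture> element: the browser uses the first source it supports. The file paths below are illustrative:

    ```html
    <picture>
      <!-- Browsers that support WebP use this source -->
      <source srcset="/images/hero.webp" type="image/webp">
      <!-- Older browsers fall back to the JPG -->
      <img src="/images/hero.jpg" alt="Hero image">
    </picture>
    ```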

    If you aren’t on WordPress, this free API is supported by many CMSs, platforms, and tools.

    4. Lazy load images:

    To improve page load time, try lazy loading images. The latest Chrome supports native lazy loading via the loading attribute, for example: <img src="path/to/img/img.jpg" loading="lazy" />.

    While Atlantic BT uses Vanilla Lazyload NPM for our website, Google’s developer blog also provides solutions for lazy loading.

    5. Serve content from a Content Delivery Network (CDN):

    With a CDN, instead of having a single server handle traffic, bandwidth spans across multiple servers. Serving content from a CDN will both create a faster experience for users and prevent downtime during traffic spikes. Many of Atlantic BT’s client websites use Amazon Cloudfront as their CDN.

    6. Remedy redirect chains:

    A redirect chain is when a single webpage is redirected multiple times before reaching its final destination. Chains tend to creep in after many 301 redirect rules are added to the .htaccess file over time, especially when several people or teams maintain it.

    You can scan your site with a tool like Moz to see which pages have multiple redirects. Then, clean up rules in the .htaccess file and/or reduce the number of redirect rules by using regex where applicable.
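    For example, rewrite each legacy rule to point directly at the final destination instead of at another redirect. The paths below are hypothetical:

    ```apache
    # Bad: /old-page → /older-page → /current-page (two hops)
    Redirect 301 /old-page /older-page
    Redirect 301 /older-page /current-page

    # Good: every legacy URL points straight to the final destination
    Redirect 301 /old-page /current-page
    Redirect 301 /older-page /current-page

    # Regex can collapse a whole family of rules into one:
    RedirectMatch 301 ^/blog/[0-9]{4}/[0-9]{2}/(.+)$ /blog/$1
    ```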

    How page speed impacted Atlantic BT’s website

    Atlantic BT found that implementing these changes was well worth the effort! After deploying page speed updates in early October, we quickly noticed an uptick in organic traffic.

    From September to November, organic traffic increased by 18%. Better still, our website conversion rate increased by 60%.

    Even after deploying these changes, we continue to look for ways to improve our site speed and enhance UX.

    Need help increasing page speed?

    Atlantic BT is happy to share page speed tips with you. We offer page speed audits and can implement any recommendations to get your website back on track. Reach out if you’re interested in learning more about our web development solutions.

    [general_cta subtitle=”Ready to get started?” title=”Get in Touch for a Free Consultation” button_text=”Contact Us” url=”/contact/” type=”button button–primary”]