Atlantic Business Technologies, Inc.

Author: Randy Earl

  • Machine Learning as a Service: It doesn’t have to be complicated.

    As I was watching the AWS re:Invent 2019 keynote addresses and product releases, I was struck by a realization: machine learning isn’t some science-fiction future yet to come – it’s already here, if you know where to look and how to use it.

    Machine Learning is increasingly available, but some approaches are easier than others.

    Our clients are asking more and more about incorporating machine learning into the solutions we provide for them. They have heard that machine learning can increasingly automate tasks that, until recently, could only be performed by human intelligence. The cost and time required for humans to perform these tasks meant they were often too expensive or couldn’t be offered in real time – document translation services, for example.

    What you may not know is that many services leveraging machine learning (ML for short) are already available. For example, Amazon Web Services (AWS) is continually developing and expanding a broad range of technology services – we watch their annual re:Invent conferences very carefully to learn more about their new offerings. In fact, AWS re:Invent 2019 introduced or expanded twenty ML based services!

    We categorize ML solutions into two models.

    I like to think of these services in two broad categories: “Ready-to-Use” and “Build-Your-Own” models. Why do I make this distinction? It comes down to what machine learning involves.

    Think about what “learning” entails for a human: years of experience, from crawling to graduate school; feedback in forms ranging from trial-and-error to peer review; and the sheer repetition involved to internalize what we learn.

    The process with machines is fundamentally the same. It takes large amounts of raw data, intense processing, and guidance to develop the algorithms. For humans, this takes years of full-time processing by the human brain. For machines, the effort required is comparable – developing effective machine learning is no small task!

    For this reason, the ready-to-use models are the ones that excite me the most. In these cases, the data gathering, algorithm development, and validation have all been done for you.

    Think of all the login captcha images you’ve identified over the years. You were “training” a machine learning algorithm.

    Which Machine Learning services are easy to implement?

    As an AWS Certified Partner, we use many of the ML-enabled services from Amazon Web Services. These are just a few:

    • Comprehend – topic, sentiment, and relationship analysis of text.
    • Transcribe – automatically convert speech to text.
    • Translate – natural and accurate language translation.
    • Polly – turn text into lifelike speech.

    As you can see, these are broadly applicable services that could be developed from widely available data sources and inputs for training the models. Being broadly applicable, there’s a good chance one of them could be useful for your business. Fortunately, these services are ready to use and integrate into your applications.
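
    To give a sense of how little code these Ready-to-Use services require, here is a minimal sketch using Python and the boto3 SDK to call Comprehend and Translate. The sample text, region, and credentials setup are assumptions for illustration only, not part of any particular project.

    ```python
    import boto3

    # Clients for two of the ready-to-use ML services (region is an example)
    comprehend = boto3.client("comprehend", region_name="us-east-1")
    translate = boto3.client("translate", region_name="us-east-1")

    feedback = "The checkout process was confusing, but the support team was wonderful."

    # Sentiment analysis with Amazon Comprehend
    sentiment = comprehend.detect_sentiment(Text=feedback, LanguageCode="en")
    print(sentiment["Sentiment"], sentiment["SentimentScore"])

    # Key phrase extraction with Amazon Comprehend
    phrases = comprehend.detect_key_phrases(Text=feedback, LanguageCode="en")
    print([p["Text"] for p in phrases["KeyPhrases"]])

    # Language translation with Amazon Translate
    spanish = translate.translate_text(
        Text=feedback, SourceLanguageCode="en", TargetLanguageCode="es"
    )
    print(spanish["TranslatedText"])
    ```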

    If you have a very specific task for a limited use case, you will likely need to use the Build-Your-Own model. As with building anything, you need the appropriate tools and techniques. Amazon SageMaker is a tool designed for just that purpose. Frankly, building your own ML model is a complex topic beyond the scope of this post.

    If you would like to learn more about how to leverage the Ready-to-Use services, watch for my next two posts in this series.

    Ready to learn more?

    If you’re interested in learning more about how you can apply machine learning, reach out for a consultation to get started.

  • A Look Inside Atlantic BT’s DevOps Process

    To deliver robust solutions to clients, code must be reliable, scalable, maintainable, and secure. This level of quality can only be achieved by building a solid software development process throughout the Software Development Life Cycle (SDLC).

    The Benefits of DevOps Methodology

    DevOps is the combination of cultural philosophies, practices, and tools that increases an organization’s ability to deliver applications and services at high velocity.

    Atlantic BT adopted the DevOps methodology because we saw the following benefits, both tangible and intangible, for our ability to deliver quality solutions to our clients:

    Tangible Benefits

    • Shorter development cycle
    • Increased release velocity
    • Improved defect detection
    • Reduced deployment failures and rollbacks
    • Reduced time to recover upon failure

    Intangible Benefits

    • Increased communication and collaboration
    • Improved ability to research and innovate
    • Promotion of a performance-oriented culture

    How will working with a DevOps partner benefit me?

    You can benefit from partnering with a company that follows DevOps practices in the following ways:

    • Faster delivery of features
    • More stable operating environments
    • More time available to add value (rather than fix/maintain existing features)

    DevOps Process Chain

    Because DevOps is a cultural shift built on collaboration between development, operations, and testing, it focuses on process and approach.

    Atlantic BT takes the following steps in our DevOps process for software development and delivery:

    • Code – Conduct code development and review, version control tools, and code merging
    • Build – Implement continuous integration tools and build status
    • Test – Run tests and review results to measure performance and quality
    • Package – Create artifact repository and application pre-deployment staging
    • Release (Deploy) – Set up change management, release approvals and release automation
    • Configure – Implement infrastructure configuration and management, as well as Infrastructure as Code tools
    • Telemetry – Implement application performance monitoring and end user experience measurements

    Elements of Atlantic BT’s DevOps Process

    Automation with Jenkins

    Because automation is an important part of DevOps, your tool set is essential. Atlantic BT’s primary Continuous Integration (CI) tool is the Jenkins automation server. Jenkins is an extensible, cross-platform, open source automation server for continuous integration and delivery.

    Jenkins supports version control systems like Git, making it easier for developers to integrate changes to the project and for users to obtain a fresh build. It also allows us to define build pipelines and integrate with other testing and deployment technologies.
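
    As a rough illustration of driving Jenkins programmatically, the sketch below uses the python-jenkins library to trigger a job and check the result of its last completed build. The server URL, credentials, and job name are hypothetical placeholders.

    ```python
    import jenkins  # pip install python-jenkins

    # Connect to the Jenkins server (URL and credentials are placeholders)
    server = jenkins.Jenkins(
        "https://ci.example.com", username="build-bot", password="api-token"
    )

    job_name = "example-app-pipeline"  # hypothetical pipeline job

    # Queue a new build of the job
    server.build_job(job_name)

    # Report the result of the most recently completed build (e.g. "SUCCESS")
    info = server.get_job_info(job_name)
    last_completed = info["lastCompletedBuild"]["number"]
    print(server.get_build_info(job_name, last_completed)["result"])
    ```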

    Automated Testing

    We have a dedicated QA department and include QA time as part of the development plan as a best practice. As a minimum baseline, we evaluate the platform using unit and functional testing.

    Our Continuous Integration tools perform the following key test elements:

    • Unit Test validation
    • Integration Test validation
    • Code analysis
    • Functional Tests

    Once sections of an application have been QA’d through unit and functional tests, automated tests can be developed for ongoing quality assurance.
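
    For readers unfamiliar with what unit test validation looks like in practice, here is a minimal, hypothetical example of the kind of test a CI server runs on every build, written with pytest. The function under test is invented purely for illustration.

    ```python
    # test_pricing.py – a hypothetical unit test executed by the CI server
    import pytest


    def apply_discount(price: float, percent: float) -> float:
        """Hypothetical application code under test."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)


    def test_apply_discount_reduces_price():
        assert apply_discount(100.0, 15) == 85.0


    def test_apply_discount_rejects_invalid_percent():
        with pytest.raises(ValueError):
            apply_discount(100.0, 150)
    ```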

    Infrastructure-as-Code Approach

    ABT optimizes cloud architecture for maximum reliability and scalability while maintaining security. We take an infrastructure-as-code approach, scripting all instance builds so they can be automated—and thus reliably replicated—in the production process.

    The ability to reliably configure and stand up server instances is critical, as most complex projects require many servers of different configurations at different stages of the project to accommodate development, testing, migration, and production needs. This approach also facilitates Disaster Recovery planning and implementation.
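
    Our actual provisioning scripts vary by project and tool, but as a simplified sketch of the infrastructure-as-code idea, the snippet below uses the AWS SDK for Python (boto3) to launch an EC2 instance from a repeatable, version-controlled definition. The AMI, key pair, and security group IDs are placeholders.

    ```python
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Launch a staging web server from a scripted, repeatable definition.
    # All resource identifiers below are placeholders.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="t3.micro",
        KeyName="example-key",
        SecurityGroupIds=["sg-0123456789abcdef0"],
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Environment", "Value": "staging"}],
        }],
    )
    print("Launched:", response["Instances"][0]["InstanceId"])
    ```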

    Monitoring, Metrics, and Alerting

    Understanding the importance of metrics, we maintain a fully staffed NOC that monitors key performance parameters and issues alerts 24/7/365. We take responsibility for monitoring application and infrastructure health, including:

    • Application availability and response time
    • CPU, memory, and disk utilization
    • Throughput
    • HTTP response codes
    • Database connections

    Metrics for applications hosted on Amazon are collected in Amazon CloudWatch; others are determined as appropriate by hosting method.
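
    As one example of how such alerting can be wired up, the snippet below creates a CloudWatch alarm on instance CPU utilization using boto3. The instance ID and SNS topic ARN are placeholders, and real thresholds depend on the application.

    ```python
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Alarm when average CPU exceeds 80% for two consecutive 5-minute periods.
    cloudwatch.put_metric_alarm(
        AlarmName="web-server-high-cpu",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:noc-alerts"],
    )
    ```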

    DevOps and AWS

    Atlantic BT’s AWS partnership enables us to fully tap into their set of flexible services, which are designed to empower companies to deliver products using DevOps practices. These services simplify provisioning and managing infrastructure, deploying application code, automating software release processes, and monitoring application and infrastructure performance.

    AWS Command Line Interface

    In addition to the AWS console, developers can manage their AWS environments with command line tools like the AWS Command Line Interface (CLI). The AWS CLI is a unified tool for managing AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts.

    The AWS CLI has over 140 simple file commands for making efficient calls to AWS services.

    CI/CD Pipeline on AWS

    CI/CD Pipeline on AWS allows you to automate your software delivery process, such as initiating automatic builds and deploying to Amazon EC2 instances. AWS CodePipeline will build, test, and deploy your code every time there is a code change. Use this tool to orchestrate each step in your release process.
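
    For illustration, here is a small sketch of interacting with CodePipeline from Python via boto3 – starting a release run and checking the status of each stage. The pipeline name is hypothetical.

    ```python
    import boto3

    codepipeline = boto3.client("codepipeline", region_name="us-east-1")

    pipeline_name = "example-app-release"  # hypothetical pipeline

    # Manually start a run (normally triggered automatically by a code change)
    execution = codepipeline.start_pipeline_execution(name=pipeline_name)
    print("Started execution:", execution["pipelineExecutionId"])

    # Check the status of each stage in the release process
    state = codepipeline.get_pipeline_state(name=pipeline_name)
    for stage in state["stageStates"]:
        status = stage.get("latestExecution", {}).get("status")
        print(stage["stageName"], status)
    ```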

    Other Amazon Tools

    Other Amazon tools we use include:

    • Amazon API Gateway: a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale.
    • AWS CloudTrail: a web service that records AWS API calls for your account and delivers log files to you.
    • AWS CodePipeline: a service that builds, tests, and deploys your code every time there is a code change, based on the release process models you define.
    • AWS Identity and Access Management (IAM): manages access, letting you specify which user can perform which action on a pipeline.
    • Amazon CloudFront Reports and Analytics: offers detailed cache statistics reports, CloudFront usage monitoring, lists of popular objects, and near real-time alarms on operational metrics.
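
    As a brief, hedged example of one of these tools in action, the snippet below uses boto3 to query CloudTrail for recent API activity – useful when auditing who changed what in an environment. The event name shown is just one of several supported lookup filters.

    ```python
    import boto3

    cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")

    # Find recent RunInstances calls recorded by CloudTrail
    events = cloudtrail.lookup_events(
        LookupAttributes=[
            {"AttributeKey": "EventName", "AttributeValue": "RunInstances"}
        ],
        MaxResults=10,
    )
    for event in events["Events"]:
        print(event["EventTime"], event.get("Username"), event["EventName"])
    ```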

    Start Implementing DevOps Today

    Ultimately, organizations that implement DevOps evolve their products faster than those that use traditional software development and infrastructure management processes. This speed enables them to better serve their customers and compete more effectively in the market.

    If you’re interested in getting help implementing DevOps or looking for a software development partner that follows best practices, contact us to learn more.

  • Leveraging NLP for Better Survey Data & Customer Satisfaction

    How often have you found yourself frustrated when answering a survey? Perhaps you were not presented with an option that covered your case or enabled you to raise your concern. Maybe you wished for a place to provide more detailed information.

    In either situation, the firm missed out on information it could have used to improve your experience.

    Why Should I Include More Open-Ended Survey Questions?

    While multiple choice questions are straightforward to analyze and show clear trends in responses, they only leave room for answers the survey writer anticipated. This is fine for some questions – yes or no, how many times, Likert ratings, or questions with only a few possible responses.

    For other questions, like “how do you feel about our product?”, it’s nearly impossible to anticipate any adjective a person would want to use.

    Furthermore, using multiple choice for such a question limits responses in a way that distorts the data. You could lead the survey taker into submitting a misleading response by forcing their selection into predetermined categories.

    Multiple choice questions can help you identify a problem, but they rarely provide enough insight to help you solve the problem.

    Open-ended questions allow respondents to provide answers in their own words, focusing on what is important to them. With no restrictions on their response, you can identify new issues that you would not have thought to include in your questions.

    In addition, this kind of open text feedback will often contain information about context (in which circumstances an event occurred) and additional detail (exactly what happened).

    The Challenge With Open-Ended Question Analysis

    While open-ended questions can provide a wealth of meaningful information, it takes a great deal of time to analyze them properly. In fact, Indi Young, user researcher and founding partner of Adaptive Path, plans for 8 to 10 hours of analysis time for every hour of recorded interviews or text read at natural speed. We have found this estimate to be realistic.

    Why does it take so long? It takes time because you don’t know what you are looking for – you will know the valuable nuggets when you see them, but only analyzing all the data will provide the patterns to reveal them. To do this, you have to:

    • Go through every word in the responses
    • Identify the topics that are mentioned
    • Identify the labels people are using to distinguish those topics
    • Map different labels people use for the same things
    • Repeat the process for adjectives and modifiers
    • Identify how they feel about these topics (positive, negative, or neutral)
    • Discern contexts that clarify the meanings
    • Extract relevant details that can be used in developing solutions

    This process may seem like overkill – if you have a dozen or two short responses, most people can read through them and take away one or two key points. However, if you have hundreds of responses, or the respondents can go into detail and provide longer answers, then you rapidly obtain more information than can be usefully processed merely by reading through them.

    A structured analysis, aggregating the detailed responses from many participants, can reveal insights that might easily be missed in small samples. However, few firms have the resources to provide that kind of analysis on hundreds or even thousands of responses.

    When to Incorporate Natural Language Processing for Surveys

    Fortunately, machine learning-enabled algorithms have developed to the point where much of this analysis can be automated. The process is called Natural Language Processing, or NLP for short. While it can’t do everything listed above, NLP can be of great assistance in two major areas: 1) Topic Analysis (what people are talking about), and 2) Sentiment Analysis (how they feel about those topics).

    Using NLP to perform that preliminary work of topic and sentiment analysis can give the research team a great head start and allow them to instead focus on what human experts do best – assimilate those results and then look at the contextual information and details to glean valuable insights. Furthermore, it reduces human error and bias.

    A Real-World Example With Amazon Comprehend

    During the Discovery phase of projects, Atlantic BT frequently uses surveys to conduct user research. Recently, we needed to analyze responses in a survey performed as a part of brand research for a pharmacy school.

    In this instance, Atlantic BT was working with 800 responses from hundreds of participants. At an average of one minute per response, simply reading through all these would take 13.5 hours, or two full days. And that’s before performing any analysis – remember the point above about proper analysis taking 8 to 10 times longer? That would mean that a fully manual analysis of that content would take three weeks!

    Instead, we chose to use Natural Language Processing to perform the basic topic and sentiment analysis, which allowed our research team to rapidly identify key areas to focus on and research more fully. We chose Amazon Comprehend as the NLP tool to use.

    Why We Chose Amazon Comprehend

    Amazon Comprehend is a service that uses machine learning to draw insights from text. You could use this tool to identify positive or negative connotation or to pick out specific phrases within responses. According to Amazon, full capabilities include:

    • Identifying the language of text
    • Extracting key phrases, places, people, brands, or events
    • Understanding how positive or negative text is
    • Analyzing text using tokenization and parts of speech
    • Automatically organizing a collection of text files by topic
    • Building custom sets of entities or text classification models that are unique to your organization

    As Atlantic BT is an Amazon partner, we find that Amazon Comprehend is compatible with our other toolsets, is continually being improved, and is very cost effective.
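
    To make this concrete, here is a simplified sketch of the kind of batch topic and sentiment pass one can run with Comprehend via boto3. The sample responses are invented, and a production analysis would also handle errors, longer texts, and the service’s input limits more carefully.

    ```python
    from collections import Counter

    import boto3

    comprehend = boto3.client("comprehend", region_name="us-east-1")

    # Hypothetical open-ended survey responses
    responses = [
        "I chose the school because of its national ranking and dual-degree program.",
        "The campus is right on the lake, which made the visit memorable.",
    ]

    sentiments = Counter()
    topics = Counter()

    # The batch APIs accept up to 25 documents per call
    for start in range(0, len(responses), 25):
        batch = responses[start:start + 25]

        sentiment = comprehend.batch_detect_sentiment(TextList=batch, LanguageCode="en")
        for item in sentiment["ResultList"]:
            sentiments[item["Sentiment"]] += 1

        phrases = comprehend.batch_detect_key_phrases(TextList=batch, LanguageCode="en")
        for item in phrases["ResultList"]:
            topics.update(p["Text"].lower() for p in item["KeyPhrases"])

    print(sentiments.most_common())
    print(topics.most_common(10))
    ```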

    What We Learned Through Natural Language Processing Analysis

    Once the full analysis was complete, Atlantic BT’s user research team was able to draw conclusions that helped drive a website redesign and content strategy.

    Eight major topics were identified as reasons for wanting to attend this pharmacy school. Further research – cross-validating these insights against other sources such as search terms and Reddit discussions – enabled us to refine our insights around these topics. Understanding prospective students’ motivations in selecting a school and program is critical to boosting the conversion rate of these low-volume, high-value transactions: first applying to a school, then choosing it over the other schools that approved their application.

    Just a few examples of the insights gained include:

    • Deep Motivations: While things such as national rankings are of obvious importance, we learned more about how motivations and decisions were shaped by a key influencer in the applicant’s life; the stories related in the responses were extremely helpful in identifying content topics which would resonate with and reinforce those motivations. These factors often influence decisions around programs and schools to which they will apply.
    • Natural Environment: While not necessarily something one would think about in selecting a pharmacy school, the comments made it clear that proximity to a lake and other outdoor activities was a differentiator for many applicants. Factors like this can make a large difference in turning an offer into an acceptance – which is very important when most applicants have been accepted by multiple schools.
    • Multiple Value Propositions: Students must now make a complex return-on-investment calculation when weighing their career options against student debt. Dual-degree programs can save a year of education, and a variety of programs offer opportunities to specialize within the field of pharmacy and thus expand career opportunities. Responses identified these and more as important decision points.

    These types of themes were leveraged to create engaging content, matching the needs and motivations of prospective students towards the end goal of increasing quality applications and acceptance into the pharmacy school.

    Need Help Conducting User Research?

    Atlantic BT is well-versed in user research; conducting user and stakeholder surveys is just one phase of our Discovery process. Contact us to learn more about our UX Research and Design services.

  • 4 Simple Tips for a Successful Lean UX Canvas

    While visuals are a prominent piece of website interaction, user experience design goes far beyond aesthetics. UX professionals go deeper, seeking to create experiences that meet user needs and connect their interactions with business goals.

    Ideally, this process begins with user research. During this phase, a user researcher and their team will dig into the real-life tasks and pain points that people experience. After gaining a thorough understanding of user motivations and behaviors, the team will then design products to help meet business goals while solving real-life problems of the target market.

    When resources are limited, rely on a Lean UX Canvas.

    Not every project can follow an in-depth user research process. Common reasons for trimming it down include lack of budget, difficulty accessing representative users, or a limited project timeframe.

    When resources are limited, you shouldn’t throw UX principles out the window. Instead, we recommend a workshop technique called the Lean UX Canvas. The canvas was created by Jeff Gothelf, and it just so happens that he recently released the Lean UX Canvas V2.

    How does a Lean UX Canvas work?

    First, a Lean UX Canvas is used to help teams recognize their core business problems and break them down into key assumptions. Then, these assumptions can be reworked into hypotheses for future testing.

    A Lean UX Canvas creates a foundation for running useful tests that reduce risk and drive smarter UX decisions.

    Tips for a successful Lean UX Canvas:

    A skilled facilitator can guide you through the canvas pretty quickly. Participants and stakeholders should keep the following principles in mind:

    • You are not your user; their perspectives are often different.
    • Don’t get too far ahead of yourself. This isn’t visual design, but if you get this part wrong the visual design won’t matter.
    • Be careful with your assumptions! You have to make them, but keep yourself as grounded as possible by whatever data or knowledge you have. It’s great to propose that your tool will save the world, but being unrealistic will cause problems later.
    • Making changes is okay. The canvas should be a flexible document. Just because something gets written down in a workshop doesn’t mean it is set in stone – in fact, it should reflect new information as you learn.

    Ready to get started?

    Atlantic BT has experience with Lean UX Canvases, along with full-fledged user research and testing. Contact us if you’re interested in seeing how we can help you.

  • Avoid Costly Patch Fixes By Planning for Accessibility [Video]

    Accessibility is important to ensure all of your visitors can use the features and content provided on your website. However, good accessibility can’t simply be added as an afterthought – it must be built in from the start. Let me give you an example with a simple analogy.

    The Benefits of Considering User Requirements From the Start

    We recently renovated our bathroom – we took a decades-old, utilitarian bathroom and turned it into a bright, pleasant space with much more convenient features. As we were happily admiring the final results, we realized we had forgotten something. We have aging family members visit and wanted them to have a sturdy handrail for getting into and out of the combination tub/shower. We assumed the contractor could do this easily. Unfortunately, when we asked him to do it he said, “Well, it depends.”

    A handrail that will support body weight should be attached to a stud, not just to drywall. Therefore, the handrail had to go wherever the stud was; we couldn’t just put it anywhere. As it turns out, the stud wasn’t in the ideal location.

    Because we hadn’t planned ahead, we were faced with a choice. We could either provide a compromise solution, or tear out the work already done and put in a brace, then put the fixtures and tile back (an expensive and frustrating process). So, we learned our lesson the hard way: consider ALL requirements for ALL users from the start.

    The Consequences of Skipping Over Accessibility

    It may seem like it’s not a big deal to move the brace a few inches one way or the other, but we have to remember that for someone who needs it, proper location can be extremely important – improper placement could lead to imbalance, a fall, and a broken bone. This example is of physical accessibility in three-dimensional space – accessibility in the digital space has similar constraints and similar real-life consequences for those who need it.

    In the digital world, some accessibility considerations are easy to provide, such as alt text for all images. However, other concerns are much more difficult to implement. For example, presenting content in a meaningful sequence is very different for a vision-impaired person using an audio screen reader than for a fully sighted user.

    Planning for Accessibility in Web Design

    So how do you plan for accessibility in your website redesign? Our UX and Design team members are certified Accessibility professionals and can work with you to identify accessibility needs and design them into your site from the beginning.

    We offer a free preliminary assessment – call us today to learn more.

  • How to Innovate in a Highly-Regulated Environment

    ABT helped Mutual Drug navigate a highly-regulated environment to provide a modern, user-friendly application which met and exceeded industry standards. Here’s how we modernized this healthcare website.

    Needed: A Secure and Streamlined Ordering System

    Pharmacists and pharmacy managers must maintain an inventory and order replenishment stock, just like any business selling physical products. However, pharmacies have the additional challenge of meeting the regulatory requirements for dealing with controlled substances (drugs that require a doctor’s prescription). Specifically, any electronic ordering system they build or use must comply with the Controlled Substances Ordering System (CSOS) requirements of the Drug Enforcement Administration (DEA). In essence, this requires pharmacists to digitally sign orders for controlled substances in order to verify the authenticity of each order.

    Atlantic BT’s client, NC Mutual Drug, is a pharmaceutical distributor with $1.2B+ in B2B volume. Their existing system, while CSOS-compliant, was cumbersome to use and required logging in and navigating two different systems. The client tasked us with designing and building a new system that was secure, highly available, fault tolerant, fully compliant with CSOS requirements and, most importantly, simpler and faster to use than their previous system. Achieving these objectives made it easier for the client’s customers to place small orders more frequently, thus reducing the need for bulk orders and product stockpiling.  

    Performing 11 Validations without Losing Your Mind

    Conceptually, the technical challenge was straightforward: use Public Key Infrastructure (PKI), as the standard requires, to manage a system of digital signatures that could then encrypt orders for controlled substances and ensure their authenticity and security. This kind of technology is often integrated with web applications to facilitate the secure electronic transfer of information for a range of activities such as e-commerce, internet banking, and confidential email.
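
    To illustrate the underlying idea (not the actual CSOS implementation), the sketch below uses Python’s cryptography library to sign an order payload and verify the signature. Real CSOS orders rely on DEA-issued certificates and many additional validations that are omitted here.

    ```python
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # In production the private key comes from the purchaser's CSOS certificate;
    # generating one on the fly is purely for demonstration.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    order = b"NDC=00000-0000-00;QTY=10;PO=12345"  # hypothetical order payload

    # The purchaser digitally signs the order...
    signature = private_key.sign(order, padding.PKCS1v15(), hashes.SHA256())

    # ...and the supplier verifies the signature before accepting the order.
    # verify() raises InvalidSignature if the order was altered in transit.
    public_key.verify(signature, order, padding.PKCS1v15(), hashes.SHA256())
    print("Signature verified: order is authentic and untampered.")
    ```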

    Straightforward, however, did not mean simple—we had to design, build, and test a robust, scalable, secure system that would perform eleven validations for each transaction, yet be simple and efficient for the user. After working closely with the client to understand all the usability and functional requirements, we proposed a design to meet their needs.

    Following the Rules, Even When They’re Old

    The real challenge was to implement this standard in a way that was efficient and intuitive yet compliant with standards written over a decade ago (and hence technologically outdated).

    Making matters even more complicated, the detailed requirements for implementing a CSOS-compliant system are scattered across 300+ pages in over half a dozen government documents. On top of that, the final system would have to be certified by a 3rd-party auditor. Given the dispersed requirements and 3rd-party verification, development of a compliant CSOS system could become a very long, expensive process if not managed carefully.

    We needed to design a more modern web application which would perform both the client and server actions on a consolidated platform—while satisfying standards written more than 10 years ago. 

    Solution: Communicate, Iterate, and Evaluate

    To resolve any open questions, early in the process we contracted with an established 3rd party CSOS auditor to evaluate the application. Atlantic BT worked closely with the auditor to share documents and information so they could provide feedback on the development direction. Atlantic BT then performed multiple internal audits and tests to save our client the significant costs of multiple official audits.

    After extensive back-and-forth discussion with the client and the auditor, including a couple of challenges both to the requirements and to the proposed solution, all parties agreed a slight modification to ABT’s original design would meet both the client’s requirements and the standard. We built the system to the agreed-upon design, tested it, and had it evaluated by the auditors, who approved and certified the application as compliant.

    Result: Elegant Compliance Meets Streamlined Usability

    NC Mutual Drug now has a state-of-the-art solution for their customers to easily, securely place orders for their pharmaceuticals, including controlled substances. They can now rest assured they have a much more robust, fault-tolerant, scalable system that can easily grow with them into the future.

    Beyond stability and compliance, a validation process that formerly took 3+ minutes and multiple systems can now be completed in 30 seconds on a single interface. Considering NC Mutual Drug’s operation runs hundreds of these processes every day, this exceptional boost in efficiency frees up member pharmacists to perform more important tasks to protect customer health.

    Get a more detailed look at the system Atlantic BT delivered by reading our in-depth writeup of Mutual Drug’s new CSOS system.