Livin’ on the “Edge” – Improve Developer Velocity with Fastly

The rise of IoT (internet of things) has delivered new devices that make our lives better. For example, Alexa plays music, alerts us when a package has arrived, and settles dinner table disputes by giving us definitive answers. For smart speaker owners, one of the most annoying aspects of this amazing device is latency. A question has to be sent to the cloud, interpreted, answered, and returned to the device; it may only take a few seconds, but in that time, our impatience grows. It’s crazy that we could possibly be annoyed with this magical device, but it’s human nature.

This problem will only get worse as the number of devices proliferates and the amount of data collected for processing increases. “Edge” computing has developed as a way to address issues like latency, bandwidth, and privacy. The “edge” is all about moving functionality closer to the source of data collection and consumption instead of relying on the cloud: moving data computation, machine learning, and AI to the outer edge, even to the device itself.

All this talk of “edge” computing focuses on the outermost edge that is closest to the data source. There are inner edges as well that have been around for years. Some might argue that the “edge” is just a new term for what CDNs (content delivery networks) have been doing over the past 20 years. The major difference is that, while CDNs offer cached data, the “edge” provides both caching and processing. New companies like Fastly are offering an “edge” cloud network to help with performance, latency, and security.

Originally built to handle simple CRUD operations, Lob’s API has expanded over the years to include authentication, request enrichment, rate limiting, and much more. In other words, our API service had taken on the characteristics of an API gateway. As we looked to improve the velocity of our engineering teams, we identified Fastly as a way to move many of these gateway features to the “edge.” This would free up teams to work more independently through a group of shared services.

What problem are we trying to solve?

Lob’s architecture was well-built initially, but our scale has wildly outpaced the original design. Such is the case with our main code base, lob-api, the nexus of all things Lob. Want to verify an address? Create a postcard? Update your account information? It all goes through the lob-api codebase. Lob-api isn't just handling our basic print and mail CRUD; it's also responsible for a range of ancillary features, and in that way it performs more like an API gateway.

  • It’s performing request enrichment: requests for US address verifications routed through lob-api are forwarded to our verification service, and lob-api annotates each one: “this is an API call from a customer with a credit card on file, the API key is X, and the mode is test or live.”
  • It’s performing authentication: Address Verification is a good example again since it relies on lob-api to identify whether a user is authenticated.
  • It also handles rate limiting. (The sketch below shows how these responsibilities stack up in a single codebase.)
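
As a rough illustration of that coupling, here is a minimal sketch of gateway-style middleware living inside a single Express service. The names, routes, and in-memory stand-ins are hypothetical and simplified; they are not Lob’s actual code, just a picture of how authentication, rate limiting, and enrichment end up interleaved in one codebase.

```typescript
// Hypothetical sketch: authentication, rate limiting, and request enrichment
// all coupled inside one API service (not Lob's real implementation).
import express, { Request, Response, NextFunction } from "express";

type AuthedRequest = Request & { account?: { id: string; mode: string } };

const app = express();
app.use(express.json());

// Stand-ins for real account lookup and rate-limit state.
const accounts = new Map([["test_key_123", { id: "acct_1", mode: "test" }]]);
const requestCounts = new Map<string, number>();

// Authentication: resolve the API key to an account before anything else.
app.use((req: AuthedRequest, res: Response, next: NextFunction) => {
  const apiKey = req.header("Authorization") ?? "";
  const account = accounts.get(apiKey);
  if (!account) return res.status(401).json({ error: "invalid API key" });
  req.account = account;
  next();
});

// Rate limiting: a per-process counter as a stand-in for real rate limiting,
// running in the same process as everything else.
app.use((req: AuthedRequest, res: Response, next: NextFunction) => {
  const count = (requestCounts.get(req.account!.id) ?? 0) + 1;
  requestCounts.set(req.account!.id, count);
  if (count > 150) return res.status(429).json({ error: "rate limit exceeded" });
  next();
});

// Request enrichment: annotate the payload before forwarding it downstream.
app.post("/v1/us_verifications", (req: AuthedRequest, res: Response) => {
  const enriched = { ...req.body, api_key_id: req.account!.id, mode: req.account!.mode };
  // In the real system this payload would be forwarded to the verification service.
  res.json({ forwarded: enriched });
});

app.listen(3000);
```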

Our move to the “edge” extracts these API gateway-level pieces of functionality and abstracts them into a series of services that sit above the regular service stack.

What benefits does Fastly’s edge offer?

One improvement is that our engineering teams can work without stepping on each other’s toes.

In the past, we regularly had merge embargoes on the main branch, and that was not ideal. We want teams focused and operating independently, not siloed, but working in a non-blocking way. Our move to the “edge” improves developer velocity and productivity because teams are no longer bogged down and one issue doesn’t cascade into others.

The second improvement is speed. Fastly provides many extra features that let us go from zero to 60. No need to worry about dashboard authentication versus API key authentication or to ask “which database table do I need to hit?” This is all handled via Fastly’s UI and plugin architecture.

Our Implementation

Our Edge Service launched with five functions: authentication, request enrichment, caching, rate limiting, and routing. A new Node.js service tackles authentication and enrichment, while Fastly is responsible for caching, routing, and rate limiting.
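
With that split, downstream services no longer need to re-authenticate every request themselves; they can rely on context attached upstream. The sketch below shows what consuming that context could look like. The header names (x-lob-account-id, x-lob-mode) are hypothetical examples, not the actual contract used by Lob’s edge service.

```typescript
// Sketch of a downstream service that trusts context added at the edge.
// Header names are illustrative assumptions, not Lob's real contract.
import http from "node:http";

const server = http.createServer((req, res) => {
  // The edge service has already authenticated the caller and enriched the
  // request, so this service only reads the forwarded context.
  const accountId = req.headers["x-lob-account-id"];
  const mode = req.headers["x-lob-mode"]; // "test" or "live"

  if (!accountId) {
    // Requests that bypass the edge carry no context and are rejected.
    res.writeHead(403, { "content-type": "application/json" });
    res.end(JSON.stringify({ error: "must come through the edge" }));
    return;
  }

  res.writeHead(200, { "content-type": "application/json" });
  res.end(JSON.stringify({ account: accountId, mode, verified: true }));
});

server.listen(8080);
```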

Beyond the five functions above, we faced another challenge: a large number of bad actors registering accounts in order to abuse our system. Fastly had acquired a security startup called Signal Sciences with the ability to block bad actors at the edge (rather than at the cluster). We implemented the block at the Fastly level without any downtime for our API.

Fastly is built on Varnish, an open-source, stateless edge caching layer. Fastly distributes the cache across its entire network, which means a higher likelihood of a cache hit; in terms of cache hit ratio, we believe Fastly tends to be superior to alternatives.

That said, we continue to explore pathways to improve the cacheability of our resources, and we are looking into our code base to find ways to increase performance. One challenge in Lob’s architecture is that some responses return similar but different data for the same resource depending on a user’s role, so in Fastly there end up being multiple variants for a given URL. Figuring out how to serve the correct data to a particular user can get tricky.
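
One standard HTTP mechanism for this situation is the Vary response header, which tells the cache that the response body depends on a particular request header, so it stores one object per (URL, header value) pair instead of serving the wrong variant. The sketch below illustrates the technique with a hypothetical x-user-role header; it is an illustration of the idea, not our implementation.

```typescript
// Sketch: make per-role variance explicit to an HTTP cache via the Vary
// header. The x-user-role header name is a hypothetical example.
import http from "node:http";

http.createServer((req, res) => {
  const role = (req.headers["x-user-role"] as string) ?? "viewer";

  res.writeHead(200, {
    "content-type": "application/json",
    "cache-control": "public, max-age=300",
    // Tell the cache that the response body depends on this request header.
    vary: "x-user-role",
  });

  const body =
    role === "admin"
      ? { id: "psc_123", status: "created", billing: { amount_cents: 75 } }
      : { id: "psc_123", status: "created" }; // restricted roles see less data

  res.end(JSON.stringify(body));
}).listen(8080);
```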

In the end, “edge” services at Lob fell into two camps: one at the edge of our service cluster where all the little applications run, and the other at the CDN edge. The challenge is figuring out the best use cases for each so we can maximize their value. We believe most of the value for Lob will be at the cluster level.

Migration in progress

We selected our address verification product as the first priority, giving that team more control over its call flow and reducing instances of downtime related to that flow. Address verification is being removed from the lob-api code base so the Fastly edge can route traffic directly to the address verification services. The result has been increased operational excellence for Lob’s API.
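
Conceptually, the edge layer is doing path-based routing: requests matching verification endpoints go straight to the verification service, and everything else falls through to lob-api. Below is a rough, self-contained sketch of that routing logic. The backend hostnames and path prefixes are placeholders, and in practice the mapping lives in Fastly’s configuration rather than in a Node proxy like this.

```typescript
// Sketch of path-based routing at an edge layer (placeholder backends/paths).
import http from "node:http";

const BACKENDS: Record<string, string> = {
  "/v1/us_verifications": "http://address-verification.internal:8080",
  "/v1/intl_verifications": "http://address-verification.internal:8080",
};
const DEFAULT_BACKEND = "http://lob-api.internal:8080";

http.createServer(async (req, res) => {
  const path = req.url ?? "/";
  const prefix = Object.keys(BACKENDS).find((p) => path.startsWith(p));
  const origin = prefix ? BACKENDS[prefix] : DEFAULT_BACKEND;

  // Buffer the incoming body so it can be forwarded to the chosen backend.
  const chunks: Buffer[] = [];
  for await (const chunk of req) chunks.push(chunk as Buffer);
  const body = Buffer.concat(chunks);

  // Forward the request and relay the upstream response to the caller.
  const upstream = await fetch(origin + path, {
    method: req.method,
    headers: { "content-type": req.headers["content-type"] ?? "application/json" },
    body: ["GET", "HEAD"].includes(req.method ?? "GET") ? undefined : body,
  });

  res.writeHead(upstream.status, { "content-type": "application/json" });
  res.end(await upstream.text());
}).listen(8080);
```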

Lob also recently launched a product feature called Campaigns for marketers to upload addresses in bulk, select creative, and execute mail campaigns. Campaigns has its own service; instead of proxying through Lob API, it goes through the Edge service.

What’s next?

We identified Fastly as a way to move our primary gateway features to the “edge,” but Fastly offers a lot more than what we are using today. To improve caching of resources, we plan to explore signing asset links at the edge. Currently, postcard responses include signed thumbnail URLs that are only valid for 24 hours, which means an object can effectively live in the Fastly cache for at most 24 hours. A solution that utilizes Fastly involves not pre-signing the asset URL: the asset itself can be cached for weeks or even months, and Fastly signs the URL on demand when a request comes in. Fastly also has a Compute@Edge platform, and more research is needed to understand what value can be derived from it.
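
To illustrate the on-demand signing idea, the sketch below generates a short-lived HMAC-signed URL for an asset that itself stays cacheable for much longer. The secret, query parameter names, and 24-hour TTL are illustrative assumptions, not our production scheme.

```typescript
// Sketch: sign an asset URL on demand so the underlying object can stay
// cached long-term while the link handed to the caller is short-lived.
import { createHmac } from "node:crypto";

const SIGNING_SECRET = "replace-me"; // would come from configuration

function signAssetUrl(baseUrl: string, ttlSeconds = 24 * 60 * 60): string {
  const expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  const signature = createHmac("sha256", SIGNING_SECRET)
    .update(`${baseUrl}${expires}`)
    .digest("hex");
  return `${baseUrl}?expires=${expires}&signature=${signature}`;
}

// The cached thumbnail never changes; only the signature generated at
// request time does.
console.log(signAssetUrl("https://assets.example.com/thumbnails/psc_123_front.png"));
```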

Our ultimate goal is to reduce the reliance that we have on the lob-api code base, and on one another. As Lob grows, the operational needs of the engineering teams grow with it. Reliance on a single team creates a bottleneck that impacts velocity; to be successful, each team must be able to function more independently.

Special thanks to Adrian Falleiro, Staff Software Engineer on our Platform team, for his contribution.