Optimizing an Elixir Phoenix action with a huge JSON response by serving cached, gzipped values

Alexandr Korsak
Jul 19, 2023 · 2 min read

#elixir #phoenix #phoenixframework

Hello,

It’s great to be back, and I’m excited to share some recent code changes I’ve implemented for my clients. We’ve been working on a mobile React Native application that includes a wiki-like section, providing users with valuable step-by-step information to improve their health.

One of the challenges we faced was dealing with a massive JSON response, roughly 20MB in size, retrieved from a single endpoint during the app’s loading process. While we could split API requests and load content on-demand, we also wanted to ensure that users could access the app offline, making a one-time load during the initial launch beneficial.

The loading process was taking around 6 seconds, depending on connection speed, so I decided to implement a quick fix focused on the wiki endpoints.

In our Elixir environment, the Cachex library turned out to be the perfect fit. It ships with a warmer behaviour that refreshes cached data on boot and then on a configurable interval, and cached entries can also be given a TTL (time-to-live).

By integrating Cachex into our system, we were able to optimize the loading process, giving users a smoother experience while keeping the offline accessibility. Below is the caching setup that improved the performance of our Phoenix controller action and its JSON response.
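First, make sure the cachex and jsonrs dependencies are available (a minimal sketch; the version requirements below are my assumptions, so check hex.pm for the current releases):

# mix.exs
defp deps do
  [
    # ... existing dependencies
    {:cachex, "~> 3.6"},
    {:jsonrs, "~> 0.3"}
  ]
end

With the dependencies in place, define the warmer module: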

defmodule BlogApp.Cache.PostsWarmer do
  use Cachex.Warmer

  alias BlogApp.Posts

  require Logger
  require Jsonrs

  @cache_table :blog_app_cache
  @posts_cache_key {:posts, :list_posts}

  # How often the warmer re-runs after the initial run on boot.
  def interval, do: :timer.minutes(60)

  # Called by Cachex on boot and on every interval; returns the
  # {key, value} pairs to write, plus options for the underlying put.
  def execute(_args) do
    data = [
      get_posts()
    ]

    {:ok, data, [ttl: :timer.minutes(60)]}
  end

  def get_cached_posts() do
    get_or_put(@posts_cache_key)
  end

  defp get_posts() do
    posts = Posts.list_published_posts()

    # Render, encode and gzip once; the cache stores the compressed binary.
    posts =
      BlogAppWeb.Api.V1.PostsView.render("index.json", %{posts: posts})
      |> Jsonrs.encode!()
      |> :zlib.gzip()

    {@posts_cache_key, posts}
  end

  defp get_or_put(key) do
    # Cachex.get/2 returns {:ok, nil} on a miss and {:ok, value} on a hit.
    case Cachex.get(@cache_table, key) do
      {:ok, nil} ->
        case key do
          @posts_cache_key ->
            {key, posts} = get_posts()

            Cachex.put(@cache_table, key, posts)
            {:ok, posts}

          _ ->
            Logger.error("[blog_app] cache key not found: #{inspect(key)}")

            nil
        end

      {:ok, _value} = hit ->
        hit
    end
  end
end

The key detail is that rendering, JSON encoding, and gzip compression happen once per warmer run, and the cache stores the final compressed binary, so the controller only has to send bytes.

Next, create cache.ex in the lib folder. It wraps Cachex in a child spec so the cache and its warmer run under the application’s supervision tree:

defmodule BlogApp.Cache do
  @moduledoc """
  Cache
  """
  @cache_table :blog_app_cache

  import Cachex.Spec

  def child_spec(_init_arg) do
    %{
      id: @cache_table,
      type: :supervisor,
      start:
        {Cachex, :start_link,
         [
           @cache_table,
           [
             warmers: [
               warmer(module: BlogApp.Cache.PostsWarmer, state: "")
             ]
           ]
         ]}
    }
  end
end

Add the cache to the supervision tree in application.ex:

defmodule BlogApp.Application do
  @moduledoc false

  use Application

  def start(_type, _args) do
    children = [
      # ... other children
      {BlogApp.Cache, []}
    ]

    Supervisor.start_link(children, strategy: :one_for_one, name: BlogApp.Supervisor)
  end
end
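
Once the application boots, the warmer runs and should populate the cache. A quick sanity check from IEx (a sketch, assuming the module and cache names above):

# iex -S mix
{:ok, gzipped} = BlogApp.Cache.PostsWarmer.get_cached_posts()
byte_size(gzipped)            # size of the compressed payload, in bytes
json = :zlib.gunzip(gzipped)  # decompress to inspect the original JSON string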

And now you are ready to serve the cached, gzipped data from a Phoenix controller action.

def index(conn, _params) do
  # Assumes `alias BlogApp.Cache.PostsWarmer` at the top of the controller.
  {:ok, posts} = PostsWarmer.get_cached_posts()

  conn
  |> put_resp_header("content-encoding", "gzip")
  |> put_resp_content_type("application/json")
  |> send_resp(200, posts)
end
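
One caveat: the action above always sends a gzipped body, which works as long as every client accepts gzip (practically all HTTP clients do). If you need to support clients that don’t, a variation is to check the accept-encoding request header and decompress on the fly (a sketch under that assumption, not part of the original setup):

def index(conn, _params) do
  {:ok, gzipped} = PostsWarmer.get_cached_posts()

  # get_req_header/2 returns a list of values for the given request header.
  accepts_gzip? =
    conn
    |> get_req_header("accept-encoding")
    |> Enum.any?(&String.contains?(&1, "gzip"))

  if accepts_gzip? do
    conn
    |> put_resp_header("content-encoding", "gzip")
    |> put_resp_content_type("application/json")
    |> send_resp(200, gzipped)
  else
    conn
    |> put_resp_content_type("application/json")
    |> send_resp(200, :zlib.gunzip(gzipped))
  end
end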

It’s a short but powerful speed-up. I’ve also just started using jsonrs for encoding these huge JSON payloads.

Thank you for reading!

Available for consulting on Elixir, Go, JS, and Big Data. My website: bitscorp.co and my GitHub: github.com/oivoodoo
