In this article I would like to talk about Service Workers (SW). A service worker lets us make our application offline-ready, so that it keeps working even without an internet connection, and it also unlocks a number of enhanced features such as push notifications and background synchronization. A service worker is a background process: it lives on even after you close the browser tab and is woken up by the browser to handle events. So let’s register our first service worker.

(In this article I’ll implement all the SW-related functionality in plain JS; since the code is plain JS, it can be integrated into any JS framework such as Angular, React or Vue.)

As the first step, add a file sw.js to your project’s root folder. In app.js we have to check whether 'serviceWorker' exists in navigator, i.e. whether service workers are supported by the given browser. If they are, we can call the navigator.serviceWorker.register() method to register a new service worker, passing the path to the file where our service worker lives. This method returns a promise, so to get informed once registration is done we can chain then() after it.

if ('serviceWorker' in navigator) {
  navigator.serviceWorker
    .register('/sw.js')
    .then(registration => {
      console.log('Service worker registered', registration);
    });
}

Now that we have registered our first service worker, let’s add our first event listener. As I said before, service workers run in the background, but one thing I didn’t mention is that they are all about handling events. To attach event listeners inside the service worker file we refer to the worker’s global scope with self (inside a service worker, self points at the worker itself) and then call its addEventListener() method. In the service worker we have access to a special set of events, for example the install event, which is fired when the browser installs the service worker. The listener function receives an event object, passed in automatically by the browser, which gives us information about the installation. As we can see in the console, our service worker is successfully installed.

self.addEventListener('install', event => {
  console.log('Installing [Service Worker]', event);
});

Now we can implement static caching, also called precaching. The installation phase of a service worker is a great place to cache assets that don’t change very often, such as your app shell or basic styling. In the install event listener we use caches to get access to the Cache API and call its open() method, passing the name of our cache; this opens (and creates, if necessary) a cache with that name. We wrap the whole expression in event.waitUntil(), which keeps the install event from completing until all the caching work is done. In the then() block we get a reference to that cache and can add content to it: cache.addAll() adds the files that make up our app shell.

self.addEventListener('install', event => {
  console.log('Installing [Service Worker]', event);

  event.waitUntil(
    caches.open('static')
      .then(cache => {
        console.log('[Service Worker] Precaching App Shell');
        return cache.addAll([
          '/',
          '/index.html',
          '/favicon.ico',
          '/src/js/app.js',
          '/src/js/chart.js',
          '/src/js/materialize.js',
          '/src/js/materialize.min.js',
          '/src/css/materialize.css',
          '/src/css/materialize.min.css',
          '/src/css/style.css',
          'https://fonts.googleapis.com/icon?family=Material+Icons',
          'https://code.jquery.com/jquery-2.1.1.min.js',
          'https://cdn.jsdelivr.net/npm/chart.js@2.8.0'
        ]);
      }));
});

Now let’s check (in DevTools under Application → Cache Storage) whether these files were actually loaded into cache storage. And indeed, they were.
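One thing the code above doesn’t handle: once you ship a new version of the app shell, the old static cache sticks around forever. A common companion to precaching (not part of the original code, so treat this as a sketch) is to clean up outdated caches in the activate event; staleCaches is a hypothetical helper name, and the cache names match the ones used in this article.

```javascript
// Given all existing cache names, return the ones that should be deleted
// because the current service worker version no longer uses them.
function staleCaches(cacheNames, keepList) {
  return cacheNames.filter(name => !keepList.includes(name));
}

// The listener only makes sense inside a service worker, hence the guard.
if (typeof self !== 'undefined' && 'addEventListener' in self) {
  self.addEventListener('activate', event => {
    event.waitUntil(
      caches.keys().then(cacheNames =>
        Promise.all(
          staleCaches(cacheNames, ['static', 'dynamic'])
            .map(name => caches.delete(name))
        )
      )
    );
  });
}
```

If you rename the static cache to something versioned like 'static-v2' on each release, the old version gets swept away the next time the worker activates.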

As the next step we should retrieve those files from the cache so that our app can work in offline mode. Let’s see how to do that. Another very important event we can listen to is the fetch event. It is triggered whenever our web application fetches something, for example CSS and JS files, or even XHR requests. So in the fetch event listener of the service worker, let’s make sure we actually serve the data from our cache. I’m going to show the solution first and then explain how it works.

self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request)
      .then(response => {
        if (response) {
          return response;
        } else {
          return fetch(event.request);
        }
      })
    );
});

The expression event.respondWith() allows us to override the response that gets sent back. You can think of the service worker as a network proxy, at least when we use the fetch event: every outgoing fetch request goes through the service worker, and so does every response. If we don’t call respondWith() at all, the browser simply handles the request as usual. The expression caches.match() checks whether the given request has already been cached; if so, it resolves with the cached response. In that case we’re not making a network request at all: we intercept the request, look it up in the cache and, if it’s there, return it from the cache. If, on the other hand, we don’t find it in the cache, we continue with the original request by returning fetch(event.request). After all these changes we can finally use our web application in offline mode. Voilà 🙂


As you can see, our web application contains a chart with some static data, and nothing happens when you click the “GET DATA” button. Now I want to make that button fetch some statistics data, display it on the chart and store the response in the cache; in other words, we’ll implement dynamic caching. So let’s begin. Let’s say we have an endpoint that returns statistics about how many users have visited our fake site on each day. Now we have to grab that data and display it on the chart.

window.onload = _ => {
  const getButton = document.getElementById('get-button');

  getButton.addEventListener('click', async _ => {
    const res = await (await fetch('https://simple-pwa-8a005.firebaseio.com/data.json')).json();
  
    const sorter = {
      "monday": 1,
      "tuesday": 2,
      "wednesday": 3,
      "thursday": 4,
      "friday": 5,
      "saturday": 6,
      "sunday": 7
    };
    
    const tmp = [], orderedData = {}, pureData = {};

    // Flatten the Firebase response ({ pushId: { day: count } })
    // into a single { day: count } object.
    Object.entries(res).forEach(([_, value]) => {
      Object.entries(value).forEach(([key, val]) => pureData[key] = val);
    });

    // Place every day at its weekday index, then rebuild the object
    // in Monday-to-Sunday order (object keys keep insertion order).
    Object.keys(pureData).forEach(key => tmp[sorter[key.toLowerCase()]] = { key, value: pureData[key] });

    tmp.forEach(obj => orderedData[obj.key] = obj.value);
  
    const ctx = document.getElementById('myChart').getContext('2d');
  
    new Chart(ctx, {
      type: 'line',
      data: {
          labels: Object.entries(orderedData).map(([key, _]) => key),
          datasets: [{
              label: 'Users',
              backgroundColor: '#26a69a',
              borderColor: '#26a69a',
              fill: false,
              data: Object.entries(orderedData).map(([_, value]) => value),
          }]
      }
    });
  });
};
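The ordering logic above relies on sparse-array index assignment, which is easy to get wrong. The same idea can be isolated into a small pure helper; orderByDay is a hypothetical name (not part of the original code), and it assumes the server returns lowercase day names, as the sorter map suggests.

```javascript
const DAY_ORDER = ['monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday'];

// Rebuild an object whose keys follow Monday-to-Sunday order,
// regardless of the order the server returned them in.
// Relies on JS objects preserving string-key insertion order.
function orderByDay(pureData) {
  const ordered = {};
  DAY_ORDER
    .filter(day => day in pureData)
    .forEach(day => ordered[day] = pureData[day]);
  return ordered;
}
```

The chart labels and values then simply fall out of Object.keys() and Object.values() of the returned object.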


And as before I’m going to add the solution and explain how it works.

self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request)
      .then(response => {
        if (response) {
          return response;
        } else {
          return fetch(event.request)
            .then(res => {
              return caches.open('dynamic')
                .then(cache => {
                  cache.put(event.request.url, res.clone());
                  return res;
                })
            });
        }
      })
    );
});

Dynamic caching simply means that since we handle fetch events anyway, we also store the responses that come back in our cache. As before, we use caches to get access to the Cache API and its open() method, passing a name for our cache. The expression cache.put() stores a request/response pair: the first argument is the request URL, which acts as the identifier, and the second argument is the response. A response body can only be consumed once, so we store a clone, which contains all the response data, and return the original response to the page. That’s it. The first time, we grab the statistics data from our server and store it in the cache; from then on, we grab that data from the cache. This solution works not only for XHR requests: in the same way we can dynamically cache CSS files or even images.
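One caveat with dynamic caching is that the 'dynamic' cache grows without bound. A common extension (not part of the original code, so consider this a sketch) is to trim it to a maximum number of entries; keysToEvict and trimCache are hypothetical names, and the sketch assumes cache.keys() returns entries in insertion order, oldest first.

```javascript
// Given cache keys in insertion order, return the oldest entries
// that exceed the allowed maximum, i.e. the ones to evict.
function keysToEvict(keys, maxItems) {
  return keys.length > maxItems ? keys.slice(0, keys.length - maxItems) : [];
}

// Delete the oldest entries of a named cache until it holds at most
// maxItems entries. Only meaningful inside a service worker.
function trimCache(cacheName, maxItems) {
  return caches.open(cacheName)
    .then(cache => cache.keys()
      .then(keys => Promise.all(
        keysToEvict(keys, maxItems).map(key => cache.delete(key))
      )));
}
```

You would call something like trimCache('dynamic', 20) right after cache.put() in the fetch handler.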

Phew, that was a lot of new information.

By the way, I also want to say a few words about background synchronization. Background synchronization is all about sending data to a server even when we have no internet connection. So how does it work behind the scenes? We can use a service worker to register a synchronization task. Registering the task alone isn’t everything, of course: we also need to store the data we want to send with the request in IndexedDB. Then, if we didn’t have connectivity and it is re-established, the service worker goes ahead and executes that task: a so-called sync event is fired on the service worker, and we can listen to it. The cool thing is that this works even if we closed the tab, or, on mobile phones, even the browser.

Now I want to register our first sync task. For that, I first check whether we have access to service workers in the given browser and whether SyncManager is available in window; the sync manager is the API through which we use the background synchronization features. Then I reach out to my service worker and use its ready property to make sure it has been configured, so that we can work with it. To register a new sync task we access the sync property (this gives us access to the sync manager) and call its register() method. It takes only one argument: a tag that identifies the given synchronization task. Here I’ll name it “sync-request”. We’ll later use that tag in the service worker, when connectivity is re-established, to check which outstanding tasks we have and figure out what to do with each of them.

if ('serviceWorker' in navigator && 'SyncManager' in window) {
  navigator.serviceWorker.ready
    .then(sw => {
      sw.sync.register('sync-request');
    });
}

Now I want the “POST DATA” button to store the data we want to send in IndexedDB and register a new sync task. For that we first add some helper files to the project to make working with IndexedDB easier. Then let’s create the data we want to store. It’ll be a simple object with two properties: the first is an identifier, and the second is called “sunday” with the value 10 (for the sake of completeness of our chart :)). To store that data we use a helper function called writeData from utility.js, which takes two arguments: the name of the object store the data goes into, and the data itself. After it succeeds, we register a new sync task.

const syncButton = document.getElementById('sync-button');

syncButton.addEventListener('click', _ => {
    if ('serviceWorker' in navigator && 'SyncManager' in window) {
      navigator.serviceWorker.ready
        .then(sw => {
          const data = {
            id: new Date().getTime(),
            sunday: 10
          };

          writeData('sync-requests', data)
            .then(_ => {
              sw.sync.register('sync-request')
            });
        });
    }
});
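The helpers writeData, readAllData and deleteItemFromData live in utility.js and aren’t shown in this article. As a rough sketch of what writeData could look like with raw IndexedDB (the real utility.js may differ, e.g. it could use a promise wrapper library; the database name here is my own placeholder):

```javascript
const DB_NAME = 'sync-store'; // hypothetical database name
const DB_VERSION = 1;

// Open (and lazily create) the database with a 'sync-requests'
// object store keyed by the record's id property.
function openDatabase() {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(DB_NAME, DB_VERSION);
    request.onupgradeneeded = () => {
      request.result.createObjectStore('sync-requests', { keyPath: 'id' });
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

// Store one record in the given object store and resolve once
// the transaction has completed.
function writeData(storeName, data) {
  return openDatabase().then(db => new Promise((resolve, reject) => {
    const tx = db.transaction(storeName, 'readwrite');
    tx.objectStore(storeName).put(data);
    tx.oncomplete = () => resolve(data);
    tx.onerror = () => reject(tx.error);
  }));
}
```

readAllData and deleteItemFromData would follow the same pattern, using getAll() and delete() on the object store respectively.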

And finally we got to the most interesting part – listening to the sync event and reacting to the re-established connectivity accordingly. And as before I’m going to add the solution and explain how it works.

self.addEventListener('sync', event => {
  console.log('[Service Worker] Syncing');

  if (event.tag === 'sync-request') {
    event.waitUntil(
      readAllData('sync-requests')
        .then(async data => {
          const requests = [];

          for (const d of data) {
            requests.push(fetch('https://simple-pwa-8a005.firebaseio.com/data.json', {
              method: 'POST',
              headers: {
                'Content-Type': 'application/json',
                'Accept': 'application/json'
              },
              body: JSON.stringify({
                sunday: d.sunday
              })
            }));
          }

          const results = await Promise.all(requests);

          // Remove every successfully synced item from IndexedDB so it
          // isn't sent again on the next sync; returning the promise
          // keeps waitUntil() alive until the deletions finish.
          return Promise.all(results.map((response, index) =>
            response.ok
              ? deleteItemFromData('sync-requests', data[index].id)
              : null
          ));
        })
    );
  }
});

First we check the event tag. Then I’m using event.waitUntil() just as before; this makes sure the event doesn’t finish prematurely. Then we read the data we stored in IndexedDB (using a helper function from utility.js), loop through it, send a POST request for each stored piece of data, and delete each piece from IndexedDB once it has been successfully sent to the server. That’s all. Let’s now try this out: to test the functionality, go offline in the browser, click the “POST DATA” button, then go online again.

After clicking the “POST DATA” button while we’re offline, nothing happens; but once connectivity is re-established we can see that the sync has been performed.

And to confirm that the data was really sent to the server, we first need to delete our GET request from the dynamic cache and then click the “GET DATA” button. Voilà 🙂

That’s all for now guys. See ya later. My code is available on github: https://github.com/Draakan/simplePWA
