Filtered by JavaScript

AngularJS $q notify and resolve as a local GET proxy

April 18, 2015
1 comment AngularJS, JavaScript

Something tells me there are already solutions like this out there that are written by much smarter people who have tests and package.json etc. Perhaps my Friday-brain failed at googling them up.

So, the issue I'm having is with an Angular app that uses ui-router to switch between controllers.

In almost every controller it looks something like this:


app.controller('Ctrl', function($scope, $http) {
  /* The form that needs this looks something like this:
      <input name="first_name" ng-model="stuff.first_name">
   */
  $scope.stuff = {};
  $http.get('/stuff/')
  .success(function(response) {
    $scope.stuff = response.stuff;
  })
  .error(function() {
    console.error.apply(console, arguments);
  });
})

(Note: ideally you'd push this stuff into a service, but doing it right in the controller illustrates the point.)

So far so good. But so far so slow.

Every time the controller is activated, that AJAX GET is fired and it might be slow because of network latency.
And I might switch to this controller repeatedly within a single page load of the app.

So I wrote this:


app.service('localProxy',
    ['$q', '$http', '$timeout',
    function($q, $http, $timeout) {
        var service = {};
        var memory = {};

        service.get = function(url, store, once) {
            var deferred = $q.defer();
            var already = memory[url] || null;
            if (already !== null) {
                // Found in the in-memory cache. If `once`, that's all the
                // caller wants; otherwise send it as a notification and let
                // the network response resolve the promise later.
                $timeout(function() {
                    if (once) {
                        deferred.resolve(already);
                    } else {
                        deferred.notify(already);
                    }
                });
            } else if (store) {
                // Not in memory; fall back to the browser's sessionStorage.
                already = sessionStorage.getItem(url);
                if (already !== null) {
                    already = JSON.parse(already);
                    $timeout(function() {
                        if (once) {
                            deferred.resolve(already);
                        } else {
                            deferred.notify(already);
                        }
                    });
                }
            }

            // Always hit the network too, so the cache gets refreshed and
            // the promise resolves with the latest data.
            $http.get(url)
            .success(function(r) {
                memory[url] = r;
                deferred.resolve(r);
                if (store) {
                    sessionStorage.setItem(url, JSON.stringify(r));
                }
            })
            .error(function() {
                deferred.reject(arguments);
            });
            return deferred.promise;
        };

        service.remember = function(url, data, store) {
            memory[url] = data;
            if (store) {
                sessionStorage.setItem(url, JSON.stringify(data));
            }
        };

        return service;
    }]
)

And the way you use it is that the promise basically fires twice. First from the "cache", then from the network response.

So, after you've used it at least once, when you request data from it, you first get the cached stuff (from memory or from the browser's sessionStorage) then a little bit later you get the latest and greatest response from the server. For example:


app.controller('Ctrl', function($scope, $http, localProxy) {
  $scope.stuff = {};
  localProxy.get('/stuff/')
  .then(function(response) {
    // network response
    $scope.stuff = response.stuff;
  }, function() {
    // reject/error
    console.error.apply(console, arguments);
  }, function(response) {
    // update
    $scope.stuff = response.stuff;
  });
})

Note how it sets $scope.stuff = response.stuff twice. That means that the page can load first with the cached data and shortly after the latest and greatest from the server.
You get to look at something whilst waiting for the server but you don't have to worry too much about cache invalidation.

Sure, there is a risk. If your server response is multiple seconds slow, your user might, for example, start typing something into a form (once it's loaded from cache) and when the network request finally resolves, what they typed in is overwritten or conflicting.

The solution to that problem is to perhaps put the form in a read-only mode until the network request resolves. At least you get something to look at sooner rather than later.
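
If it helps, here's a minimal sketch of that read-only idea. The loadingLatest flag and the ng-readonly markup are made-up names for illustration, not part of the service above:


app.controller('Ctrl', function($scope, localProxy) {
  /* The form could guard itself with something like:
      <input name="first_name" ng-model="stuff.first_name"
             ng-readonly="loadingLatest">
   */
  $scope.stuff = {};
  $scope.loadingLatest = true;  // keep the form read-only until the server answers
  localProxy.get('/stuff/')
  .then(function(response) {
    // network response; now it's safe to let the user edit
    $scope.stuff = response.stuff;
    $scope.loadingLatest = false;
  }, function() {
    console.error.apply(console, arguments);
  }, function(response) {
    // cached data; show it but leave the form read-only
    $scope.stuff = response.stuff;
  });
})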

The default implementation above doesn't store things in sessionStorage. It just keeps things in memory as you're flipping between controllers. Alternatively, you might want a more persistent approach, in which case you use:


controller( // same as above
  localProxy.get('/stuff/', true)
  // same as above
)

Sometimes there's data that is very unlikely to change. Perhaps you just need the payload for a big drop-down widget or something. In that case, it's fine if it exists in the cache and you don't need a server response. Then set the third parameter to true, like this:


controller( // same as above
  localProxy.get('/stuff/', true, true)
  // same as above
)

This way, it won't fire twice. Just once.

Another interesting expansion on this is if you change the data after it comes back. A good example is if you request data to fill in a form that the user updates. After the user has changed some of it, you might want to pre-emptively cache that too. Here's an example:


app.controller('Ctrl', function($scope, $http, localProxy) {
  $scope.stuff = {};
  var url = '/stuff/';
  localProxy.get(url)
  .then(function(response) {
    // network response
    $scope.stuff = response.stuff;
  }, function() {
    // reject/error
    console.error.apply(console, arguments);
  }, function(response) {
    // update
    $scope.stuff = response.stuff;
  });

  $scope.save = function() {
      // update the cache
      localProxy.remember(url, $scope.stuff); 
      $http.post(url, $scope.stuff);
  };
})

What do you think? Is it useful? Is it "bonkers"?

I can think of one possible beautification, but I'm not entirely sure how to accomplish it.
Thing is, I like the API of $http.get: it returns a promise with functions called success, error and finally. The ideal API would look something like this:


app.controller('Ctrl', function($scope, $http) {
  $scope.stuff = {};
  // angular's $http service expanded
  $http.getLocalProxy('/stuff/')
  .success(function(cached, response) {
    /* Imagine something like:
        <p class="warning" ng-if="from_cache">Results you see come from caching</p>
     */
    $scope.from_cache = cached;
    $scope.stuff = response.stuff;
  })
  .error(function() {
    console.error.apply(console, arguments);
  });
})

That API looks and feels just like the regular $http.get function but with an additional first argument to the success promise callback.
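
For the record, here's one rough sketch of how that could be approximated. It's speculation rather than a finished thing: instead of actually extending $http it lives in its own factory (the name $httpLocalProxy and everything inside it are made up), and it calls success first with cached=true for the cached copy, then with cached=false when the network answers:


app.factory('$httpLocalProxy', function($http, $timeout) {
  var memory = {};
  return {
    get: function(url) {
      var successCallbacks = [];
      var errorCallbacks = [];
      var api = {
        success: function(fn) { successCallbacks.push(fn); return api; },
        error: function(fn) { errorCallbacks.push(fn); return api; }
      };
      if (url in memory) {
        // Fire the success callbacks with the cached copy first
        $timeout(function() {
          successCallbacks.forEach(function(fn) { fn(true, memory[url]); });
        });
      }
      $http.get(url)
      .success(function(response) {
        memory[url] = response;
        successCallbacks.forEach(function(fn) { fn(false, response); });
      })
      .error(function() {
        var args = arguments;
        errorCallbacks.forEach(function(fn) { fn.apply(null, args); });
      });
      return api;
    }
  };
});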

Match the whole word in auto complete maybe

April 10, 2015
2 comments Web development, JavaScript

(For context, I released Autocompeter.com last week and now I'm thinking about improvements)

I posted a question on Twitter about which highlighting formatting people prefer and got some interesting feedback. More about that later.

The piece of feedback that really got my attention came from my friend Honza Král.
He wondered whether the whole word should be highlighted instead of just the beginning of the word.

I've actually been thinking about that too but never got around to trying it out. Until now.

Before (screenshot)

After (screenshot)

What do you think?

I have the code in a branch and I'm still mulling it over. There's sort of a convention to just highlight based on what you've typed so far. I don't want to be too weird, because when people don't feel familiar they don't like what they see, even if the new way actually is better.

gulp-header is the best!

April 9, 2015
1 comment JavaScript

For Autocompeter I develop with gulp. It's like Grunt but better.

One thing I wanted was that when it does the src/autocompeter.js --> minify() --> dist/autocompeter.min.js step, it also puts a little preamble header into the minified file.

First I thought that since UglifyJS supports a --preamble option, that'd be the route to go. I didn't get very far.

Then I thought I had to write my own plugin. So I started reading the documentation about how to write a plugin, partially thinking "Oh, I don't have time to do this" and partially "Oh, finally a chance to sit down and really understand how gulp plugins work". I was wrong. The documentation for writing plugins says:

"Your plugin shouldn't do things that other plugins are responsible for... ...It should not add headers, gulp-header does that"

Oh! So there is already a great plugin for this! Long story short: here's how I used it. The output is that the version number is now on the first line of autocompeter.min.js.
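
The general shape of it is something like this (the banner text and file paths here are approximations, not the exact gulpfile):


var gulp = require('gulp');
var uglify = require('gulp-uglify');
var header = require('gulp-header');
var rename = require('gulp-rename');
var pkg = require('./package.json');

gulp.task('minify', function() {
  return gulp.src('src/autocompeter.js')
    .pipe(uglify())
    // gulp-header prepends a one-line preamble with the version number
    .pipe(header('/*! Autocompeter <%= version %> */\n', {version: pkg.version}))
    .pipe(rename('autocompeter.min.js'))
    .pipe(gulp.dest('dist'));
});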

I'm starting to like gulp more and more. There's even a nice dedicated index of all available plugins.

Autocompeter 1.1.8 and smooth typing

April 6, 2015
0 comments Go, JavaScript

One of the most constructive pieces of feedback I got when Autocompeter was on Hacker News was that when you type something with lots of predictable results the results overlay would "flicker".

E.g. you type "javascript" at a nice and steady pace and the overlay would shrink and grow and shrink and grow very rapidly.

The reason it happened was due to a bug in the Javascript code that filtered results whilst waiting for the next AJAX response for the latest typed character. E.g. you type "ash" and the results come back with "ashley", "ashes", "Ashford". Then you add an "l" so now we start a new AJAX query for "ashl", and whilst waiting for that output from the server we can start filtering out "ashes" and "Ashford" because we can pre-emptively know that they won't be in the new result set.
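
In other words, something along these lines, a simplified sketch rather than the actual Autocompeter source, assuming each result has a title:


// Narrow the previous results down to those that could still match the
// newly typed, longer prefix, whilst the AJAX request for it is in flight.
function preFilterResults(previousResults, newTerm) {
  var term = newTerm.toLowerCase();
  return previousResults.filter(function(result) {
    return result.title.toLowerCase().split(/\s+/).some(function(word) {
      return word.indexOf(term) === 0;
    });
  });
}

// E.g. preFilterResults(results, 'ashl') drops "ashes" and "Ashford"
// but keeps "ashley".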

The bug was a bad function that filtered the existing results on a second rendering whilst waiting for the next AJAX. It was easy to fix and this is included in version 1.1.8.

The reason I failed to notice this was because I had inserted some necessary optimizations for when the network latency was very, very slow, but I hadn't tested it in a realistic network latency environment. E.g. a decent DSL connection, but nevertheless something more advanced than just connecting to localhost.

Autocompeter.com

April 2, 2015
8 comments Go, JavaScript

About a year ago I found a great article about how to use Redis to index every prefix of every word, and thus get a super fast way of building an autocomplete service. The idea is that you take all your titles and index them like this: if the title is "My Title" you store a key for m, my, t, ti, tit, titl and title. That means you can do very fast lookups as someone is typing unfinished words.
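
The same idea expressed as a plain JavaScript sketch, with an in-memory object standing in for Redis (the real service keeps this in Redis, keyed per domain and with scores; this is only to show the prefix trick):


var index = {};  // prefix -> list of titles

function indexTitle(title) {
  title.toLowerCase().split(/\s+/).forEach(function(word) {
    for (var i = 1; i <= word.length; i++) {
      var prefix = word.slice(0, i);  // m, my, t, ti, tit, titl, title
      index[prefix] = index[prefix] || [];
      index[prefix].push(title);
    }
  });
}

indexTitle('My Title');
console.log(index['tit']);  // ['My Title'] -- instant prefix lookup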

Anyway. I was running this merrily here on my personal blog, but I liked it so much and wanted to use it on another site for work, so I thought it'd be time to extract it into its own little microservice. All I needed was a name, and my friend and colleague jezdez suggested I call it "autocompeter". So that it became.

The original implementation was written in Python and at the time I was learning Go and was eager to have something to build in Go. So I built this microservice in Go using the Negroni web framework.

The idea is that you own and run a website. You have a search feature on your website but you don't have a nifty autocomplete (aka. live search) thing on it. So, you send me all your titles, URLs and optionally their "popularity ranking" (basically a score). I'll index them on autocompeter.com under your domain. You have to sign in with GitHub to set up an API Auth Key.

Then, you put this into your HTML:


<script src="//cdn.jsdelivr.net/autocompeter/1/autocompeter.min.js"></script>
<script>
Autocompeter(document.querySelector('input[name="q"]'));
</script>

Also, you'll need to download the CSS and put into your site. I don't recommend pointing to a CDN for CSS.

And that's all you have to do. The REST API, the options for the Javascript integration and the CSS integration are all covered in the documentation.

The Javascript is framework-free, meaning it's just pure DOM manipulation, and it works in IE and modern browsers. The minified file is only 4.2Kb (2Kb gzipped).

All code is Open Source under a BSD license. Everything is free but there's no SLA as of yet.

I'm going to be blogging more and more about feature development, benchmarks and other curious things I learn developing this further.

Median size of Javascript libs on jsDelivr

February 24, 2015
0 comments JavaScript

If you haven't heard of jsDelivr then you've missed out on a great project! It's basically a free CDN to use for Open Source projects. I added my own yesterday and it was as easy as making a pull request with the initial file, some metadata and a file that tells them where to pick up new versions from (e.g. GitHub).

Anyway, they now host A LOT of files. 8,941 to be exact (1,927 unique file names), at the time of writing. So I thought I'd check out what the median size is.

The median size is: 7.4Kb

The average is 28.6Kb and the standard deviation is 73Kb(!), so we can basically ignore the average and just focus on the median.

Check out the histogram

That's pretty big! If you exclude those bigger than 100Kb the median shrinks to 6.5Kb. Still pretty big.

I'm proud to say that my own is only 50% the size of the median size.

AJAX or not

December 22, 2014
1 comment Web development, AngularJS, JavaScript

From the It-Depends-on-What-You're-Building department.

As a web developer you have a job:

  1. Display a certain amount of database data on the screen
  2. Do it as fast as possible

The first point is these days easily taken care of with the likes of Django or Rails, which make it über easy to write queries that you then use in templates to generate the HTML, and voila, you have a web page.

The second point is taken care of with a myriad of techniques. It's almost a paradox. The fastest way to render something on the screen is to generate everything on the server and send it whole. It means the browser can very quickly (and boosted by the GPU) render something on the screen. But if you have a lot of data that needs to be displayed, it's often better to send just a little bit of HTML and then let some Javascript kick in and take care of extracting the rest of the information using AJAX.

Here I have prepared three different versions of ways to display a bunch of information on the screen:

https://www.peterbe.com/ajaxornot/

Visual comparison on WebPagetest
What you should note and take away from this little experimental playground:

  1. All server-side work is done in Django but it's served straight out of memcache so it should be fast server-side.

  2. The content is NOT important. It's just a list of blog posts and their categories and keywords.

  3. To make it somewhat realistic, each version needs to 1) display a JPG and 2) have a Javascript onclick event that throws a confirm() dialog box.

  4. The AngularJS version loads significantly slower but it's not because AngularJS is slow, but because it's able to do so much more later. Loading a Javascript framework is like an investment. Big cost upfront and small cost later when you need more magic to happen without having a complete server refresh.

  5. View 1, 2 and 3 are all imperfect versions, but they illustrate the three major approaches to solving the problem stated at the top of this blog post. The other views are attempts at optimization.

  6. Clearly the "visually fastest" version is optimization version 5, which is a fork of version 2: it loads, on the server-side, everything that is above the fold and then takes care of the content below the fold with AJAX (roughly like the sketch after this list).
    See this visual comparison

  7. Optimization version 4 was a silly optimization. It depends on the fact that JSON is more "compact" than HTML. When you Gzip the content, the difference in size doesn't matter anymore. However, it's an interesting technique because it means you can do all business logic rendering stuff in one language without having to depend on AJAX.

  8. Open the various versions in your browser and try to "feel" how the pages load. Ask your inner guttural heart which version you prefer; do you prefer a completely blank screen and a browser loading spinner, or do you prefer to see some skeleton structure first whilst waiting for the bulk of the content to come in?

  9. See this as a basis of thoughts and demonstration. Remember the very first sentence in this blog post.
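
The above-the-fold trick from point 6, sketched out in plain Javascript (the URL and element id are made up; the real views differ in the details):


// The server renders everything above the fold as plain HTML.
// Once the page is interactive, fetch the rest and append it below the fold.
document.addEventListener('DOMContentLoaded', function() {
  var request = new XMLHttpRequest();
  request.open('GET', '/ajaxornot/rest-of-page', true);
  request.onload = function() {
    if (request.status === 200) {
      document.getElementById('below-the-fold').innerHTML = request.responseText;
    }
  };
  request.send();
});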

One-way data bindings in AngularJS 1.3

December 11, 2014
1 comment AngularJS, JavaScript

You might have heard that AngularJS 1.3 has "one-time bindings", which means you can print the value of a scope variable with {{ ::somevar }} and that this is really good for performance, because once it's rendered it doesn't add to the list of things that the angular app needs to keep worrying about. I.e. it's one less thing to watch.

But what's a good use case of this? This is a good example.

Because ng-if="true" will cause the DOM element to be re-created, it will go back to the scope variable and re-evaluate it.
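
To make that concrete, here's a tiny made-up example (the names are mine, not from the linked post). The greeting is rendered once and then stops being watched, but toggling show destroys and re-creates the element, so the one-time binding gets evaluated afresh:


<div ng-controller="OneTimeCtrl">
  <!-- {{ ::name }} is evaluated once per element creation, then un-watched -->
  <p ng-if="show">Hi {{ ::name }}</p>
  <button ng-click="name = 'Anders'">Change name (no visible effect)</button>
  <button ng-click="show = !show">Toggle (re-creates the element)</button>
</div>

<script>
app.controller('OneTimeCtrl', function($scope) {
  $scope.show = true;
  $scope.name = 'Peter';
});
</script>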

To then() or to success() in AngularJS

November 27, 2014
19 comments AngularJS, JavaScript

By writing this I'm taking a risk of looking like an idiot who has failed to read the docs. So please be gentle.

AngularJS uses a promise module called $q. It originates from this beast of a project.

You use it like this for example:


angular.module('myapp')
.controller('MainCtrl', function($scope, $q) {
  $scope.name = 'Hello ';
  var wait = function() {
    var deferred = $q.defer();
    setTimeout(function() {
      // Reject 3 out of 10 times to simulate 
      // some business logic.
      if (Math.random() > 0.7) deferred.reject('hell');
      else deferred.resolve('world');
    }, 1000);
    return deferred.promise;
  };

  wait()
  .then(function(rest) {
    $scope.name += rest;
  })
  .catch(function(fallback) {
    $scope.name += fallback.toUpperCase() + '!!';
  });
});

Basically you construct a deferred object and return its promise. Then you can expect the .then and .catch to be called back if all goes well (or not).

There are other ways you can use it too but let's stick to the basics to drive home this point to come.

Then there's the $http module. It's where you do all your AJAX stuff and it's really powerful. However, it uses an abstraction of $q and, because it is an abstraction, it renames its callbacks. Instead of .then and .catch it's .success and .error, and the arguments you get are different. Both also expose a function called .finally that runs regardless of outcome. You can, if you want to, bypass this abstraction and do what the abstraction does yourself. So instead of:


$http.get('https://api.github.com/users/peterbe/gists')
.success(function(data) {
  $scope.gists = data;
})
.error(function(data, status) {
  console.error('Repos error', status, data);
})
.finally(function() {
  console.log("finally finished repos");
});

...you can do this yourself...:


$http.get('https://api.github.com/users/peterbe/gists')
.then(function(response) {
  $scope.gists = response.data;
})
.catch(function(response) {
  console.error('Gists error', response.status, response.data);
})
.finally(function() {
  console.log("finally finished gists");
});

It's like it's built specifically for doing HTTP stuff. The $q module doesn't know that the response body, the HTTP status code and the HTTP headers are important.

However, there's a big caveat. You might not always know you're doing AJAX stuff. You might be using a service from somewhere and you don't care how it gets its data. You just want it to deliver some data. For example, suppose you have an AJAX request cached so that only the first time it needs to do an HTTP GET but all consecutive times you can use the stuff already in memory. E.g. Something like this:


angular.module('myapp')
.controller('MainCtrl', function($scope, $q, $http, $timeout) {

  $scope.name = 'Hello ';
  var name = null;
  var getName = function() {
    var deferred = $q.defer();
    if (name !== null) {
      // Cached from a previous call; no need for another AJAX request.
      deferred.resolve(name);
      return deferred.promise;
    }
    $http.get('https://api.github.com/users/peterbe')
    .success(function(data) {
      name = data.name;
      deferred.resolve(name);
    }).error(deferred.reject);
    return deferred.promise;
  };

  // Even though we're calling this 3 different times
  // you'll notice it only starts one AJAX request.
  $timeout(function() {
    getName().then(function(name) {
      $scope.name = "Hello " + name;
    });    
  }, 1000);

  $timeout(function() {
    getName().then(function(name) {
      $scope.name = "Hello " + name;
    });    
  }, 2000);

  $timeout(function() {
    getName().then(function(name) {
      $scope.name = "Hello " + name;
    });    
  }, 3000);
});

And with all the other promise frameworks lying around, like jQuery's, you will sooner or later forget if it's success() or then() or done(), and your goldfish memory (like mine) will cause confusion and bugs.

So is there a way to make $http.<somemethod> return a $q like promise but with the benefit of the abstractions that the $http layer adds?

Here's one such possible solution maybe:


var app = angular.module('myapp');

app.factory('httpq', function($http, $q) {
  return {
    get: function() {
      var deferred = $q.defer();
      $http.get.apply(null, arguments)
      .success(deferred.resolve)
      .error(deferred.reject)
      return deferred.promise;
    }
  }
});

app.controller('MainCtrl', function($scope, httpq) {

  httpq.get('https://api.github.com/users/peterbe/gists')
  .then(function(data) {
    $scope.gists = data;
  })
  .catch(function(data) {
    console.error('Gists error', data);
  })
  .finally(function() {
    console.log("finally finished gists");
  });
});

That way you get the benefit of one and the same way for all things that get you data, some way or another, and you get the nice AJAXy signatures you like.

This is just a prototype and clearly it's not generic enough to work with any of the shortcut functions in $http like .post(), .put() etc. That can maybe be solved with a Proxy object or some other hack I haven't had time to think of yet.
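
One low-tech way around that, without a Proxy, is to generate the wrappers from a list of method names. Just a sketch, not tested against all the shortcut signatures:


app.factory('httpq', function($http, $q) {
  var service = {};
  ['get', 'delete', 'head', 'post', 'put', 'patch'].forEach(function(method) {
    service[method] = function() {
      var deferred = $q.defer();
      $http[method].apply(null, arguments)
      .success(deferred.resolve)
      .error(deferred.reject);
      return deferred.promise;
    };
  });
  return service;
});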

So, what do you think? Am I splitting hairs or is this something attractive?

A "perma search" in AngularJS

November 18, 2014
0 comments AngularJS, JavaScript

A common thing in many (AngularJS) apps is to have an ng-model input whose content is used as a filter on an ng-repeat somewhere within the page. Something like this:


<input ng-model="search">
<div ng-repeat="item in items | filter:search">...

Well, what if you want the search you make to automatically become part of the URL so that if you bookmark the search or copy the URL to someone else, the search is still there? It would be really practical. Granted, it's not always that you want this but that's something you can decide.

AngularJS 1.2 (I think) introduced the ability to set reloadOnSearch: false on a route provider, which means that you can do things like $location.hash('something') without it triggering the route provider to re-map the URL and re-start the relevant controller.
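
In the route configuration that looks something like this (the route, template and controller names here are made up):


app.config(function($routeProvider) {
  $routeProvider.when('/search', {
    templateUrl: 'search.html',
    controller: 'SearchCtrl',
    // changes to the search/hash part of the URL no longer re-instantiate SearchCtrl
    reloadOnSearch: false
  });
});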

So here's a good example of (ab)using that to do a search filter which automatically updates the URL.

Check out the demo: https://www.peterbe.com/permasearch/index.html

This works in HTML5 mode too if you're wondering.

Suppose you use many more things in your filter function other than just a free-text ng-model. Like this:


<input type="text" ng-model="filters.search">
<select ng-model="filters.year">
<option value="">All</option>
<option value="2014">2014</option>
<option value="2013">2013</option>
</select>

You might have some checkboxes and stuff too. All you need to do then is to encode that information in the hash. Something like this might be a good start:


$scope.filters = {};
$scope.$watchCollection('filters', function(value) {
    $location.hash($.param(value)); // a jQuery function
});

And something like this to "unparse" the params:
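
The snippet for that isn't included here, but the gist would be something like the following sketch, which reads the hash back into $scope.filters when the controller starts (note that $.param encodes spaces as +, which this naive version doesn't undo):


angular.forEach(($location.hash() || '').split('&'), function(pair) {
  if (!pair) return;
  var parts = pair.split('=');
  // restore e.g. "search=peter&year=2014" into {search: 'peter', year: '2014'}
  $scope.filters[decodeURIComponent(parts[0])] = decodeURIComponent(parts[1] || '');
});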