
Wednesday, February 27, 2013

working Grunt into a primitive build workflow part 1

I'm sold on grunt. Since I'm responsible for maintaining legacy JavaScript at work, I'm now trying to organize that code so I can use grunt as part of a build process.

The Old Way: download a fresh copy of our global.js script from production every time before starting a change. Make the modification by appending a new $(document).ready() block to the end of the file. Cross my fingers, and upload to production.

This is unsustainable. We are using JavaScript more and more every day, and depending on a household deity (me) to keep everything together by using hopes and prayers is going to fail dramatically one day.

Toward a permanent solution, I looked into replacing our hand edits with a build process. I'm familiar with ant from a previous development position many years ago, but I wasn't satisfied with ant's support for web workflows. Enter grunt. Grunt is written in node, and is therefore installed through npm. There are dozens of plugins for grunt, but the ones I need are jshint, uglify, and concat. I'll want to add more as I settle in with grunt (particularly qunit), but first things first.
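For orientation, here's roughly the shape of the Gruntfile I'm working toward. This is a minimal sketch, not our real config (the src/dist paths are placeholders), but it wires the three contrib plugins into a default task:

    // Gruntfile.js: a sketch; the paths are placeholders, not our real layout.
    // Assumes the plugins were installed locally with npm, e.g.
    //   npm install grunt-contrib-jshint grunt-contrib-concat grunt-contrib-uglify --save-dev
    module.exports = function (grunt) {
        grunt.initConfig({
            jshint: { all: ['src/**/*.js'] },
            concat: { dist: { src: ['src/**/*.js'], dest: 'dist/global.js' } },
            uglify: { dist: { src: 'dist/global.js', dest: 'dist/global.min.js' } }
        });

        grunt.loadNpmTasks('grunt-contrib-jshint');
        grunt.loadNpmTasks('grunt-contrib-concat');
        grunt.loadNpmTasks('grunt-contrib-uglify');

        // lint first, then build the combined and minified files
        grunt.registerTask('default', ['jshint', 'concat', 'uglify']);
    };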

I've been using jslint on my code, but apparently my settings have been fairly permissive. After getting my grunt configuration (Gruntfile.js) set up, I created a task list that contained jshint, concat, and uglify. Well, jshint just about had a heart attack. Part of this is because of our organization's extensive use of jQuery—most of our code just assumes it exists. You can tell jshint that you're making such an assumption, but it needs to be specifically stated in a comment block before the code that attempts to use jQuery.
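Concretely, the comment block jshint wants looks something like this at the top of a file (the ready handler here is just a placeholder):

    /*global jQuery:false, $:false */
    $(document).ready(function () {
        // existing page logic goes here
    });

I believe the same assumption can also be declared once in the jshint options (there's a jquery setting), which might beat commenting every single file.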

I could force grunt to ignore the jshint errors, but what's the point of using a build tool if you're going to start developing bad habits? Now that I have grunt installed, getting my code into better shape is my next step.

Once this is done we should have a nice, reliable way of maintaining our global.js code. After that comes working in unit testing with qunit. It's all for a good cause. :)


Monday, February 18, 2013

Twitter bot in node.js

Dairy Godmother makes really awesome frozen custard. They have a special flavor of the day which they publish on their "Flavor Forecast" calendar.

Since we had a long weekend, it was the perfect chance for a project to learn some node.js, grunt, unit testing, and OAuth for authentication.

So things I learned:


  1. Writing unit tests for a module is tricky. Writing unit tests for asynchronous code is even trickier. My tests are not very good at this point; I need to make 'em better (there's a rough sketch of the kind of test I'm aiming for after this list).
  2. Vis-à-vis the Twitter 1.1 API and node-oauth, this bug hung me up for a long time before I found this thread (a sketch of the posting call is also below). Actually, I need to go back and tweak this more. Hm.
  3. Grunt is pretty awesome once it gets running, but it's still kind of confusing. I had a grip on it by Saturday; however, today a major new version was released. Doh!

Yeah. It works, but more work needs to be done. Since it's kind of a "fan" bot, I'd like it to be able to respond to the main Dairy Godmother Twitter account if it speaks to the bot.
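To give the async-testing point some shape, here's roughly what I'm aiming for, written as a nodeunit test (the flavor module and its fetchFlavor function are made-up names for illustration, not the bot's actual code):

    // a made-up module under test, for illustration only
    var flavor = require('../lib/flavor');

    exports.fetchesTodaysFlavor = function (test) {
        test.expect(2);
        flavor.fetchFlavor(new Date(), function (err, name) {
            test.ok(!err, 'no error fetching the flavor');  // any error fails the test
            test.equal(typeof name, 'string');              // the flavor should come back as text
            test.done();                                    // tell nodeunit the async work is finished
        });
    };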
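And for the Twitter side, this is roughly the shape of the status-update call through node-oauth against the 1.1 API. The credential variables and the status text are placeholders; passing the body as a plain object lets node-oauth form-encode it and include it in the signature:

    var OAuth = require('oauth').OAuth;

    var oauth = new OAuth(
        'https://api.twitter.com/oauth/request_token',
        'https://api.twitter.com/oauth/access_token',
        CONSUMER_KEY,          // placeholder credentials
        CONSUMER_SECRET,
        '1.0A',
        null,
        'HMAC-SHA1'
    );

    oauth.post(
        'https://api.twitter.com/1.1/statuses/update.json',
        ACCESS_TOKEN,
        ACCESS_TOKEN_SECRET,
        { status: "Today's flavor is..." },  // plain object, so node-oauth signs the params
        function (err, data) {
            if (err) {
                console.error(err);
                return;
            }
            console.log('tweeted');
        }
    );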

You can see the bot here: DGFlavor

I'll share my github repo once I do some QA and code review. 

 

Sunday, February 10, 2013

I am a responsive design skeptic

Responsive design came up in a few meetings at work this week. The most passionate advocate of responsive design presented some compelling arguments, but I remain a responsive design skeptic.

Responsive design is an approach to designing websites so that the content "responds" to the resolution of the visitor's screen. Mobile browsers, tablets, and desktop browsers all use the same HTML source, but the layout is adjusted through CSS and maybe some JavaScript magic.

Sunday, February 3, 2013

1996: building an art gallery with basic tools

This morning I was thinking about an artist who contracted me to design a website back in 1996. There was no off-the-shelf CMS to use, and JavaScript was not implemented in a way that made it a useful client-side technology. Although server side includes and CGI were available, the company I worked for preferred to avoid technologies that might put extra load on the web server.

The number of pages for the project was projected to be well over 100. Up until this point I'd been working on web sites that were mostly under 10 pages—they were little more than digital brochures and an email link. Maintaining them by hand was a straightforward affair. But this project was going to require discovering a new approach.

As a self-taught designer and web developer, I didn't have a great approach to gathering requirements for the site. Looking back, I realize that the site should have taken a minimalist approach, but I thought it would be neat to have a textured brick background with some subtle lighting effects to make it appear as though the artist's work was hung on a wall. Yep, I was into skeuomorphism before skeuomorphism was cool.

The artist's work was arranged in five or six galleries, each composed of 20-100 works. She had slides of each painting. Using a slide scanner, I converted the images into high-resolution JPEGs and stored them on my computer, an Amiga 3000.

The design specification was to create a homepage that listed the galleries, paginated galleries with thumbnails of the art, and a navigable slideshow for each gallery. This had to be managed with minimal SSI (header and footer only) and no CGI. Also, no CSS, which was nascent in 1996 and which I personally avoided until designing the Otakurama website in 2001/2002.

The first challenge was to convert 200+ images into thumbnail versions and slideshow versions. To tackle this I used ImageFX, which was the Amiga equivalent of Photoshop. The batch conversion capabilities of ImageFX were extremely powerful, so much so that even Photoshop CS3 isn't in the same league. I was able to use ARexx to perform all the image conversions and save them with a directory and filename schema that was compatible with the website architecture.

Next was building 200+ slideshow pages. For that I used an HTML preprocessor called hsc (HTML Sucks Completely), which I'm amazed to discover is still available in 2013 (and there's even a little hsc tutorial!). Using hsc, I was able to define templates for the paginated gallery pages and the slideshows and then assemble everything by running a script. The output of hsc is a complete website ready for deployment.

It took a few tries to get the process right, but eventually I got it down to the point where, if a site-wide change was necessary, I could just find the spot in the template where the HTML was defined, fix it once, recompile the site, and redeploy. I felt so clever at the time. :)

In retrospect, I understand that the design of the site needed to be separated from the content. It wasn't the way I'd learned to build websites, but it was a much better approach. I wasn't able to communicate how powerful this development process was to my employer—heck, I didn't grasp the idea in full at the time myself—but I grew dissatisfied developing websites the "old way" and a year later left web development for a technical support position that paid better. Hindsight's 20/20, right?