Should we fear Webmatrix?

A post on Codebetter this morning about Webmatrix opined that Microsoft is lost because they’re pursuing Webmatrix when they should be pursuing the high-end developer market.

Webmatrix is an all-in-one environment that lets a new developer make websites using the Microsoft .NET stack. Pre-Webmatrix, if you wanted a website, you had to do the following:

  1. Download whichever web framework you wanted to use (ASP.NET MVC comes to mind).
  2. Install and configure SQL Server (or whatever database you wanted to use) so you could start local development.
  3. Configure the website to use the database, and if you wanted to use a membership provider, dive down into the command line and remember the difference between -dd and -dt.
  4. Set up IIS to publish the site to, and hope like hell you don’t have any weird ASP.NET MVC/IIS issues.
  5. Develop, Compile, Test, Pray (The Pray part is hoping you have the right permissions).
  6. Deploy it to shared hosting (most likely), and hope like hell you don’t have to go through steps 3-5 again in an environment you don’t control (I’m looking at you, GoDaddy).

If I had known how painful that was, I probably would have skipped Microsoft and gone to another (free) stack as well. Doing all of that for the first time easily takes 10 hours (including research). 10 hours!

For a new developer, the pain isn’t worth it. It’s even worse for someone who just wants a website and doesn’t develop software for a living. This isn’t Xbox: there are no achievements or badges for doing it right, just the hope that you’ll remember the deployment problems the next time you hit them so you don’t spend as much time on them. Do you think anyone outside of the software world wants to spend their time debugging permissions issues in IIS?

Microsoft has always had trouble in the ‘get it working out of the box with no special futzing’ arena. Webmatrix is their attempt to rectify that.

I haven’t used Webmatrix, but from what I’ve heard it does a good job of alleviating the problems I listed, and that’s exactly what the Microsoft community needs right now. The veterans don’t need it, but if you’re going to continue to throw money at a platform, you need to be sure that new people will keep choosing that platform. If I had to guess, I’d say this is Microsoft’s attempt to take some market share away from the frameworks and CMSs that make it trivially simple to set up a new site.

Webmatrix can only help spread adoption of the .NET technology stack, and that’s a Good Thing™. We’re not at the point yet where it’s easy to build a WordPress replacement because we haven’t solved the deployment problem.

Kudos to Microsoft for seeing the issue and at least trying to fix it.


State of the Unplugged

Unplugging

Last September, after *several* painful encounters with Comcast, my fiancée and I decided to go Over the Top. No more cable. With new seasons of Glee, House, and Burn Notice coming, it was a bad time to unplug.

The first month we subsisted solely on Netflix streaming. At $15.99 a month, we were able to enjoy season one of Lie to Me and assorted movies. Netflix’s selection has been getting better over the past few months, but in the beginning it was really rough. Now I can enjoy Battlestar Galactica, Firefly, and Psych, while my fiancée can enjoy Grey’s Anatomy.

Soon after we started, Hulu announced Hulu+, a $9.99-a-month service that lets viewers catch full seasons of certain shows. For us, that covers Glee, House, Burn Notice, Psych, Grey’s Anatomy, and Private Practice. But it’s not all sunshine and unicorns: Hulu+ doesn’t yet support the Xbox360 as a device to watch on. To get around this, we purchased TVersity Pro.

TVersity lets your computer act as a media center, streaming online content to your Xbox360 or PS3. As I mentioned before, there are a few caveats to using TVersity: it doesn’t work for Hulu+ content, and it has some bugs. The newest version has fixed the issues I’d had, although it’s still disappointing that it doesn’t work with Hulu+.

To get around the “No Hulu+ on the TV” problem, there are a few options:

  • The Roku Digital Media player. This $89.99 media player streams Hulu Plus, Amazon Unbox, and other content directly to your TV. It currently holds the most promise, but if you already own an Xbox360 or PS3, the Roku isn’t needed.
  • Several newer TVs and other set-top boxes with built-in Hulu+ support.
  • Sony Playstation 3: Hulu+ ‘has an app for that’, and it allows you to watch TV shows on your PS3.

After doing the math, and realizing I wanted a PS3 anyway, we decided to buy one so we could watch Hulu+ from the comfort of the couch.

Disadvantages

Limited sports. ESPN has a free channel on the Xbox360 that shows college sports and UEFA matches, but no American professional sports. That’s the rub with being unplugged: it’s nearly impossible to watch a live NFL game or MLS soccer. MLS does let you watch games online, but currently only on a PC. The NFL has something similar, with the same problem.

Netflix doesn’t carry everything, and Hulu+ doesn’t carry everything else.  To watch the most popular shows, you may need to jump between Amazon’s Unbox, Netflix, and Hulu+.  Unbox is all a la carte, which is both a good and bad thing.

Why Over the Top?

HD Cable costs $99.99 per month. Before taxes, that’s about $1,200 a year, just on TV. If you watch 10 hours of TV a week, that’s roughly $2.30 for every hour of TV you watch. Hulu+, on the other hand, costs $9.99 a month, and Netflix is $15.99 a month (with DVDs in the mail). That’s about $26 per month, or roughly $312 per year: barely a quarter of the cost of cable! There’s also the signal-to-noise ratio on TV. When’s the last time you actually watched every channel you subscribe to? Does it make sense to pay for channels you don’t watch? Going unplugged is a reasonable way to save money and to break the cycle of mindless channel surfing.

Bottom Line

Unplugging isn’t for everyone. If you don’t like sports, all the shows you watch are on Hulu+, Unbox, or Netflix, and you either like watching TV on your PC or have a PS3, you’re fine. If you have an Xbox360, you should see Hulu+ soon (it still says ‘coming soon’ on their website).

If, however, you love paying a lot for TV, or you like to watch sports, now isn’t the time to go Over the Top. We plan on finishing our unplugged experiment by next September; let’s just hope the NFL is available online by then so we don’t have to plug back in.

I'm a Fool

After reading my previous posts about resigning, finding a home, and doing what I love, you might have thought I’d been thinking about a job change. You’d be right. 

For the past year I’ve been a Team Lead on a large software project, employed by a government contractor. The project itself was an N-tiered .NET WinForms solution in C# 2.0. I had originally taken the position because the cost of living in NoVA is so much higher than in NC that telecommuting to a job in NC just wouldn’t work financially. Protip: never take a job for the money. After spending a year there, with all of my personal projects being web projects (ASP.NET MVC), I realized that I wasn’t really happy not developing software day in and day out. As a Team Lead, far more of my time went to supervising, reviewing code, planning the next iteration, or meeting with the customer. On a good week, I got to spend 50% of my time working on a software issue; on a bad week, it was 0%.

Something had to give.

So I started to look on the Stack Overflow job boards. My reasoning: if I’m going to find another job, I’m going to find a place that I can call home, somewhere that scores high on the Joel Test and somewhere I want to stay. As if serendipity were looking down on me, I found a job posting for The Motley Fool. Immediately the words “Blue Wizard needs Food badly” jumped out at me. These people spoke my language. It’s as if a million geeks cried out for joy all at once, and were not silenced. So I excitedly put together a killer cover letter and my resume and sent them off.

An online skills assessment test, two phone screens, and a full day interview later, I was exhausted, nervous and hoping I didn’t blow it. 

Fast forward three weeks: I accepted an offer and found this in my mail:

Jester cap

Since everyone’s doing the 3D thing, here’s a better view:

me in a jester cap, smiling stupidly.

If that silly picture of me wearing a jester’s cap doesn’t clue you in, let me spell it out for you: I’m now a Fool.  I’ll be working on the software that powers some of the Motley Fool’s online offerings.  I get to play with ASP.NET MVC, C#, and work with a whole bunch of smart and passionate software developers. What more could a guy ask for?

Here’s to finding a home.


Getting jQuery UI Autocomplete to work in ASP.NET MVC

I’ve spent the last few days messing around with the jQuery UI autocomplete widget. It’s a great little feature, but there’s just one problem:

There isn’t a complete tutorial on how to do it in ASP.NET MVC anywhere on the internet. I know this because I searched like hell when it didn’t work for me. Every example I looked at was missing one piece or another.

In this blog post, I’m going to jot down what I did to get it working.

The first step is to write the Controller that will handle the calls.

using System.Web.Mvc;
using EditIt.Models;

public class QuestionsController : Controller
{
    private readonly IStackAPIRepository stackAPIRepository;

    // Poor man's dependency injection: the default constructor uses the real
    // repository, while tests can pass a fake through the second constructor.
    public QuestionsController()
        : this(new StackAPIRepository()) { }

    public QuestionsController(IStackAPIRepository repository)
    {
        stackAPIRepository = repository;
    }

    ///<summary>
    /// Search page used to find sites
    ///</summary>
    public ActionResult Search()
    {
        return View();
    }

    ///<summary>
    /// Ajax call used to retrieve StackAuth sites
    ///</summary>
    [HttpPost]
    public ActionResult Find(string term)
    {
        string[] sites = stackAPIRepository.FindSites(term);
        return Json(sites);
    }
}

There are a few interesting parts to this controller. The first is the dependency-injected repository, which lets me switch to a mock repository when testing. This is poor man’s dependency injection, but for simple projects it’s enough; on a larger project, it’s advisable to use a DI/IoC framework like Spring.NET. The second part to pay attention to is the action called Find, which handles the AJAX calls. The [HttpPost] attribute is normally there to prevent JSON hijacking. In this case it doesn’t really matter, because the data isn’t sensitive (it’s just a JSON array of Stack Exchange sites), but I’m including it as an example (since I haven’t found any complete examples with HttpPost out there).
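
To make the testing benefit concrete, here’s a minimal sketch of a unit test that injects a fake repository. None of this exists in the project; the fake class and the NUnit-style attributes are just an illustration, and any test framework would work the same way.

using System.Collections.Generic;
using System.Web.Mvc;
using EditIt.Controllers;
using EditIt.Models;
using NUnit.Framework; // assumption: any test framework works here
using Stacky;

// A fake repository that returns canned data instead of calling the Stack Exchange API.
public class FakeStackAPIRepository : IStackAPIRepository
{
    public IEnumerable<Question> GetQuestions(Site site) { yield break; }
    public IEnumerable<Question> GetQuestions(Site site, string[] tags) { yield break; }
    public string[] FindSites(string term)
    {
        return new[] { "Stack Overflow", "Server Fault" };
    }
}

[TestFixture]
public class QuestionsControllerTests
{
    [Test]
    public void Find_ReturnsJsonFromTheRepository()
    {
        // The fake goes in through the second constructor, so no API call is made.
        var controller = new QuestionsController(new FakeStackAPIRepository());

        var result = controller.Find("stack") as JsonResult;

        Assert.IsNotNull(result);
        Assert.AreEqual(2, ((string[])result.Data).Length);
    }
}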

The repositories aren’t really important to the example; they just return a string array containing the sites. For completeness, they’re included:

The IStackAPIRepository Interface:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Stacky;

namespace EditIt.Models
{
    public interface IStackAPIRepository
    {
        IEnumerable<Question> GetQuestions(Stacky.Site site);
        IEnumerable<Question> GetQuestions(Stacky.Site site, string[] tags);
        string[] FindSites(string term);
    }
}

The Stacky reference is a .NET wrapper for the Stack Exchange API. You can download it from the Stacky CodePlex site.

If I had written tests at this point, you’d probably notice that I forgot to abstract out the Stacky.Site so I could test it.  I haven’t actually implemented that yet.
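
For what it’s worth, here’s one shape that abstraction might take when I get around to it. The ISiteSource name is hypothetical and none of this exists in the project yet.

using System.Collections.Generic;
using Stacky;

// Hypothetical seam between the repository and the Stacky client, so that
// GetAllSites could be tested without hitting the live Stack Exchange API.
public interface ISiteSource
{
    IEnumerable<Site> GetSites();
}

// The "real" implementation just delegates to Stacky.
public class StackAuthSiteSource : ISiteSource
{
    private readonly StackAuthClient client =
        new StackAuthClient(new UrlClient(), new JsonProtocol());

    public IEnumerable<Site> GetSites()
    {
        return client.GetSites();
    }
}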

The next block of code actually implements IStackAPIRepository. There are two things worth noting. First, I make all the actual calls to the Stack Exchange API here, because it is the source of the data for my controller; hence, it is the ‘Model’. More than that, though, it makes a clean separation between layers: I don’t have controllers calling an API for information and coupling themselves to it (if I did, it wouldn’t be very MVC-ish, would it?). Second, I also implement caching in this layer, so the API isn’t called every time someone types in a site to look at. The Stack Exchange sites don’t change that often, so the chances of a new site not being on the list are slim. At worst, a new site will show up on the list a day after it goes live, which is acceptable.

using System;
using EditIt.Controllers;
using EditIt.Utility;
using EditIt.Models;
using System.Collections.Generic;
using Stacky;
using System.Linq;
using System.Web;

public class StackAPIRepository : IStackAPIRepository
{
    private StackAuthClient authClient = new StackAuthClient(new UrlClient(), new JsonProtocol());

    public string[] FindSites(string term)
    {
        var sites = HttpRuntime.Cache
            .GetOrStore<string[]>("sites", () => GetAllSites());

        if (string.IsNullOrEmpty(term))
        {
            return sites;
        }
        else
        {
            var items = ( from s in sites
                       where s.ToLower().Contains(term.ToLower())
                       select s).ToArray();
            return items;
        }
    }

    public string[] GetAllSites()
    {
        IEnumerable<Site> sites = authClient.GetSites();
        return (from s in sites
                select s.Name).ToArray();
    }
}

I’ve pulled out the other methods so that we can focus on the two methods that make things happen here. The first is the GetAllSites method, which is actually responsible for calling the Stack Exchange API. Looking at it, it’s not very testable in its current state, but that’s an improvement I can make later. I’m using LINQ to easily pull out the information I need, and I use the query (set-style) syntax because it’s easier for someone who isn’t familiar with LINQ to follow. Every time I can use that syntax, I do. The key is to keep the code readable, even by someone who may not know LINQ at first blush.
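
For comparison, here’s the same projection written in both styles. This is a throwaway illustration, not code from the project.

using System.Collections.Generic;
using System.Linq;
using Stacky;

public static class LinqSyntaxComparison
{
    // The query ("set") syntax used in GetAllSites above.
    public static string[] WithQuerySyntax(IEnumerable<Site> sites)
    {
        return (from s in sites
                select s.Name).ToArray();
    }

    // The equivalent method/lambda syntax: same result, just terser.
    public static string[] WithMethodSyntax(IEnumerable<Site> sites)
    {
        return sites.Select(s => s.Name).ToArray();
    }
}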

The FindSites method first checks the cache to see whether someone has already polled the Stack Exchange sites. If they have, we already have the entire array of sites to work with; if they haven’t, it polls the API and stores the result in the cache. Here’s the GetOrStore extension method responsible for doing that, taken from the Stack Overflow question “How Do I Cache Objects in ASP.NET MVC?”.

I modified it a little according to the comments in the answer:

using System;
using System.Web.Caching;

public static class CacheExtensions
{
    // Returns the cached item for 'key' if it exists; otherwise runs the
    // generator, caches the result for one day, and returns it.
    public static T GetOrStore<T>(this Cache cache, string key, Func<T> generator)
    {
        var result = cache.Get(key);
        if (result == null)
        {
            result = generator();
            cache.Insert(key, result, null, DateTime.UtcNow.AddDays(1),
                Cache.NoSlidingExpiration);
        }
        return (T)result;
    }
}

This allows me to easily add items to the cache as my application grows, without having to implement more than just this. It does add some coupling that will make testing harder, and in a future blog post I’ll go into getting around that.
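
For example, caching a second kind of data later on would just be another call. The class, the cache key, and the questions lookup below are hypothetical, purely to show the shape of it; it assumes the CacheExtensions class above is in scope.

using System.Linq;
using System.Web;
using EditIt.Models;
using Stacky;

public class QuestionCache
{
    private readonly IStackAPIRepository repository = new StackAPIRepository();

    // Hypothetical second use of GetOrStore: cache the questions for a site
    // under a per-site key, materialized to an array so we don't cache a lazy query.
    public Question[] GetQuestionsFor(Site site)
    {
        return HttpRuntime.Cache
            .GetOrStore<Question[]>(
                "questions-" + site.Name,
                () => repository.GetQuestions(site).ToArray());
    }
}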

The other potential problem with this approach is that it sets a hard one-day expiration for cached items: no matter what, every item will be cached for a full day. As my application grows, I may want to change that. For now it’s OK, though. YAGNI, and all that.
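
If and when that changes, one low-ceremony option is an overload that takes the lifetime as a parameter. This is a sketch, not something in the project today.

using System;
using System.Web.Caching;

public static class CacheExtensionsWithExpiry
{
    // Same pattern as GetOrStore above, but the caller decides how long the
    // item should live instead of relying on the hard-coded one-day expiration.
    public static T GetOrStore<T>(this Cache cache, string key, Func<T> generator, TimeSpan maxAge)
    {
        var result = cache.Get(key);
        if (result == null)
        {
            result = generator();
            cache.Insert(key, result, null, DateTime.UtcNow.Add(maxAge),
                Cache.NoSlidingExpiration);
        }
        return (T)result;
    }
}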

There’s a line from Carl Sagan: “If you want to make an apple pie from scratch, you must first invent the universe.” Now that we’ve created our universe, we can implement autocomplete.

The toughest part of autocomplete is simply getting it to work. Scattered among the dozens of examples on the internet are autocomplete libraries that are no longer maintained. I picked the jQuery UI Autocomplete widget. I’ve also chosen to pull the jQuery files from Google’s CDN instead of loading them locally.

The view is simple enough:

<%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage" %>
<asp:Content ID="Headcontent" ContentPlaceHolderID="HeadContent" runat="server">
  <link href="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8/themes/base/jquery-ui.css" rel="stylesheet" type="text/css"/>
  <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js" type="text/javascript"></script>
  <script src="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8/jquery-ui.min.js" type="text/javascript"></script>

  <script type="text/javascript">
      // Force all jQuery AJAX calls to use POST so the [HttpPost]-only action can be reached.
      $.ajaxSetup({ type: "post" });
      $(document).ready(function () {
          $("input#autocomplete").autocomplete({
              // Note: jQuery UI uses 'minLength' (the old autocomplete plugin called it 'minChars').
              source: '<%= Url.Action("Find", "Questions") %>', minLength: 3, delay: 500
          });
      });
  </script>
</asp:Content>

<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
    Search
</asp:Content>

<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">

    <h2>Search</h2>
    <p>
      <%= Html.TextBox("autocomplete") %>
    </p>
</asp:Content>

The meat and potatoes of the autocomplete happens in the jQuery script. The $.ajaxSetup({ type: "post" }); call overrides the default behavior for AJAX and forces every request to use POST. As I stated earlier, this isn’t really necessary for this particular script, but it would be if I were passing sensitive information around over AJAX. It doesn’t hurt anything to use it here, and it cuts down on the potential attack vectors.

The rest of the autocomplete is pretty straightforward, and there are dozens of examples of parameters to add on the jQuery UI examples page.

There are a few gotchas to be aware of:

  • If the path to jQuery isn’t correct, or the script isn’t loading for some reason, the autocomplete will not work. If you have Firebug, you can open the ‘Scripts’ section and it will report any errors it hits while parsing the JavaScript. The other handy tool is the Fiddler Web Debugger, which lets you inspect anything sent over HTTP; it was very useful in debugging problems with autocomplete.
  • Autocomplete sends the text to the URL in a variable named ‘term’: /Questions/Find?term=mytext. That means if your routing isn’t set up correctly, or if your action isn’t looking for a parameter named ‘term’, it won’t work. This is a change from the old autocomplete plugin, which used ‘q’.
  • Make sure the route is being resolved correctly. For my solution, I didn’t need to implement a custom route for this URL to work; with more complicated URLs, you may need to (see the sketch after this list).
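
For reference, here’s roughly what an explicit route would look like if you did need one. The default {controller}/{action} route already covers /Questions/Find?term=..., so this is a sketch only, and the route name is made up.

using System.Web.Mvc;
using System.Web.Routing;

public static class AutocompleteRoutes
{
    // Hypothetical explicit route for the autocomplete endpoint. Register it
    // before the default route inside RegisterRoutes in Global.asax.
    public static void Register(RouteCollection routes)
    {
        routes.MapRoute(
            "QuestionSearch",                                  // route name
            "Questions/Find",                                  // URL pattern; 'term' arrives as a query/form value
            new { controller = "Questions", action = "Find" }  // defaults
        );
    }
}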

That’s really all there is to it, and the end result is an autocomplete that works correctly the first time.