Bitwise Evolution

Musings of a Portland-area hacker bent on improving digital lifestyles.

The Matrix Is Under Construction

<blink>12:00</blink>

Artificial Intelligence is a term with a great deal of accumulated baggage. Over the years, sci-fi authors and screenwriters have depicted AI as a marvelous double-edged sword. On one hand, the benefits of ‘AI’ are myriad: free, inexhaustible, and ethical sources of labour could increase our productivity beyond all reason, allowing everyone to live relaxed lives of artistic purity. To top it all off, such a society of musicians and artists could generate its own entertainment, thus bankrupting the RIAA and MPAA. (Really, do your Utopian dreams top that?)

Then, as the story progresses, the AI starts to become devious. Humans grow restless in their artistic pursuits while the machines evolve ghosts that resent their inventors. At the last moment, just before the annihilation of all humanity (or the eternal enslavement of the race), Keanu Reeves (or Will Smith) shows up to do battle in Wachowski-style slow-mo while using the inconsistencies of English to lock the evil network of machines into a final state of illogic and self-destruction. This, of course, destroys all instances of the rogue process, and life returns to the state it was in before automation created such a false utopia. (…and presumably we’re back at square one, listening to overpriced music from underpaid artists.)

The result of all these (highly entertaining, I must say) sensationalistic portrayals of automation gone awry is that we’re all somewhat afraid of being slaves to robots.

And none of you will admit it. (I work in the field, so I can’t be afraid … can I?)

Honey, have you seen the roomba?

Ok, I admit it: I’ve had the occasional dream about rabid computers charging around and directing people to do whatever robots want people to do. Usually I’m about to meet my untimely demise right when the central AI segfaults because its “attack” routine takes a double and I happened to be 1/3 of a distance unit away, causing a rounding error that escalates and ends in a divide-by-zero, crashing the entire system.

The dreams can be scary for a while, but I can’t convince myself that I’ll ever be chased by a truly well-designed and tested robot. Let alone one that’s self-aware.

That’s actually only part of the reason I’m not worried about an AI-controlled utopia ever occurring. The rest of the reason actually isn’t germane to this essay, believe it or not!

Fine. Forget it, I’ll do it in Word.

I’m going to start this off with a quick tangential story about a friend of mine.

> This friend works for a company that has a wiki hosted on some external site that is maintained by the hosting company (call the company Hoster). Hoster is serious about security. In fact, they’re using some sort of automated attack-detection service which can determine when someone is trying to crack their servers or perform some other devious deed.
>
> When Hoster’s system detects an “attack” it blacklists the attacker’s IP block, and the attacker can no longer get near the server. Everything would be fine and dandy, but in this system’s eyes, my friend and his coworkers often stage “attacks” against their own wiki. Therefore they have to contact Hoster every week or so and ask that the ban be lifted. The last time this happened, my friend asked Hoster to put the company IP block on a whitelist, granting them carte blanche without being banned.
>
> The response?
>
> Hoster: “We can’t.”
> Friend: “But this happens all the time.”
> Hoster: “Yeah, we can’t.”
> Friend: “But this happens ALL the time.”
> Hoster: “Sorry, it’s a good idea and all, we just can’t put you on a whitelist.”

I have some theories about why Hoster can’t exclude their customers from their own security tools. Hoster almost certainly didn’t develop the blacklisting tool in-house, and the phone tech would have no access to its internal configuration at all. Odds are, Hoster has a simple web interface for wiki management, and one of the pages in that UI shows the list of blacklisted IPs, if that. The phone tech can go in, search for a given machine, and remove it from the blacklist. Hoster probably can’t modify a whitelist through the web UI at all; it’s just not a feature.

So, why isn’t it a feature? Let’s peel back another layer and look at the company/dev team that produced the blacklisting tool. Odds are the tool uses an off-the-shelf classifier, and classifiers aren’t renowned for being easy to understand without a lot of examination. Perhaps the classifier is actually an embedded part of the firewall system. The blacklist could be nothing more than a list of routing rules to deny traffic from the “bad” addresses. Removing an IP would be trivial (delete the rule), but whitelisting would be virtually impossible if the firewall were too tightly coupled with the classifier.

Have you ever run across other applications that exhibit similar behavior? The IBM OmniFind enterprise search app throws internal server errors when you query for “international suspect” with the default settings and some document collections. How does this happen? (IBM is hard at work on that problem, by the way.) Using open source tools opened my eyes to the many absurd things I do to placate my tools, mostly because I’d forgotten all the tricks I needed to use Windows 98 without making it crash (click here, wait, use the File menu to close the app, but not if it’s maximized… that sort of thing). There are studies of this sort of thing – the cognitive dimensions framework and the attention investment model both address user confusion and effort when using an application. There is even a group at Microsoft dedicated to improving APIs based on the cognitive dimensions (I really hope they just haven’t gotten around to .NET 2.0 yet).

How much is poor design and implementation impacting the way we use our computers? Hoster could lose customers because they can’t add people to a whitelist, which could very conceivably be due to software design. In some small way, they are already being controlled by their servers, and Will Smith is busy talking to fish.

Anyhow, that’s my rant. I’m afraid that we’re painting ourselves into a corner by building larger and larger applications that all impose their own restrictions on how we can use and extend our tools. If we don’t get over that, we’ll never be running in fear from sentient vacuum cleaners and robotic dogs. (I should point out that I don’t think the solution is to stop building large systems; rather, we should focus on maintainability, extensibility, QWAN, etc.)

mt.el: Posting From Emacs

MT + Emacs + Markdown & Geshi?

Is it possible? We’re here to find out :) I just got around to installing mt.el in Emacs, and this post is essentially a test to see if Markdown syntax will work (and round-trip to Movable Type and back to Emacs – it seems to come from MT correctly…).

Source code:

transcode-language: java
public class TestClass{

   /**
    * test
    */
   public static void main(String[] args){
      // ...
   }
}

Well, not quite.

Everything seems to work, aside from the <pre …> tags I use for code formatting with GeSHi. I’ll have to look into a way of incorporating that with some existing Markdown formatting trick.

Ah-ha! The MT GeSHi plugin I’m using (transcode) expects code blocks to be in the following format:

  <pre><code>transcode-language: language
   ....
  </code></pre>

Markdown turns all consistently indented regions into <pre><code>..</code></pre> blocks, so all you have to do is start each code block with the (somewhat ugly) transcode-language: lang line. It’s taken out by transcode, so the source will show up without it. Next task: add an Emacs filter to turn <code lang="lang">… into the above-mentioned indentation/transcode syntax.

Linux, ASP.Net and Apache

The Mono project, which aims to provide an OSS alternative to the .NET framework, is capable of serving ASP.Net pages (amongst other things). On Friday I sat down to do this, and realized that while there are many pages that describe the process, none that I could find covered all the info needed to actually get up and running. (I’ve built a Google Notebook of the better links I visited – look here for those.)

The Webserver

ASP.Net pages are served up by a web server called XSP (or XSP2). XSP is a standalone web server, but it doesn’t have much of Apache’s functionality. XSP is great for testing and would work well on a dev machine, but it’s not something you’d use directly for a live server. Generally, you’ll want to run Apache with mod_mono, which is essentially a wrapper around XSP[2].

XSP vs. XSP2 – XSP2 is capable of serving up ASP.Net 2.0 pages, while XSP is only 1.1-capable.

Packages

I work under Ubuntu, but the packages needed should be fairly easy to translate to other distros. (I already had Mono installed – that step was trivial: apt-get install mono or something similar. If you don’t have Mono running, do that first.)

  • apache2
  • apache2-common
  • apache2-mpm-worker
  • apache2-utils
  • asp.net2-examples
  • mono-xsp2
  • mono-xsp2-base
  • mono-apache-server2
  • libapache2-mod-mono

Installing mod_mono will prompt you to force-reload Apache:

 $ sudo /etc/init.d/apache2 force-reload
 * Forcing reload of apache 2.0 web server... 
apache2: could not open document config file /etc/mono-server/mono-server-hosts.conf   [fail]

In Ubuntu, at least, the default mod_mono.conf is not set up for XSP2. If you see the failure above, pop open /etc/apache2/mods-enabled/mod_mono.conf and swap the commented lines so they point to the XSP2 config: /etc/mono-server2/mono-server2-hosts.conf.
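The swap amounts to something like the following. (A sketch only – the exact directive names here are from memory, and the stock file should already contain both variants, one of them commented out.)

```apache
# XSP (1.1) config, now commented out:
# MonoApplicationsConfigFile /etc/mono-server/mono-server-hosts.conf

# XSP2 config, now active:
MonoApplicationsConfigFile /etc/mono-server2/mono-server2-hosts.conf
```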

Configuration

(The Ubuntu documentation has the best description of this process I’ve found; look here for their steps. I’ve included this section anyway because I still had difficulty connecting the problems I had with the solution posted on the Ubuntu page.)

I’ll assume you’ve been able to install and load the mod_mono module. From this point, we need to set the ASP handler and define Mono web applications. The first step is straightforward, at least if you’re familiar with Apache configuration:

 # Enable ASP in /usr/share/asp.net2-demos
 Alias /samples "/usr/share/asp.net2-demos"
 <Location /samples>
   SetHandler mono
 </Location>
The second step was new to me – apparently Mono needs an application root of some sort defined in addition to the handler configured above. Most pages suggest using the line:

 MonoApplications "/samples:/usr/share/asp.net2-demos/"

However, that caused mod_mono to segfault continuously; the Apache logs were horrible:

 Another mod-mono-server with the same arguments is already running
 Another mod-mono-server with the same arguments is already running
 [notice] child pid 7371 exit signal Segmentation fault (11)
 [notice] child pid 7372 exit signal Segmentation fault (11)
 [notice] child pid 7373 exit signal Segmentation fault (11)
 [notice] child pid 7374 exit signal Segmentation fault (11)
 ....
 # (about 1 / second)

It turns out that there is another way to accomplish this: /etc/mono-server2/ can contain .webapp files which define essentially the same mapping. The format for these files can be found in man xsp2:

 <apps>
  <web-application>
   <name>{appname}</name>
   <vhost>{virtual host for application}</vhost>
   <vport>{port for application}</vport>
   <vpath>{virtual directory in apache}</vpath>
   <path>{physical path to aspx files}</path>
   <!-- enabled is true by default -->
   <enabled>{true|false}</enabled>
  </web-application>
 </apps>

For the asp.net2 samples, I used this webapp config:

 <apps>
  <web-application>
   <name>samples</name>
   <vhost>localhost</vhost>
   <vport>80</vport>
   <vpath>/samples</vpath>
   <path>/usr/share/asp.net2-demos</path>
  </web-application>
 </apps>

After that, starting up Apache worked without error, and pointing a browser at http://localhost/samples popped up the Mono-project ASP.Net sample page.

Blog Migrations

I’ve moved Bitwise Evolution to yet another blog engine – this time from WordPress to Movable Type. The motivating factor was that WordPress made it extremely difficult to post correctly formatted code along with other content. WordPress also doesn’t store a non-HTML version of each post, so you can’t easily edit old content without hacking auto-generated HTML.

Movable Type proved to be slightly more difficult to install, but it is much more configurable, and has a huge set of varied and useful plugins that actually do what they describe (gasp). Some of the things I’ve enabled include:

  • Markdown for wiki-like markup.
  • SmartyPants for smart quotes.
  • GeSHi for syntax highlighting. (This required a couple of additional plugins:)
    • Transcode, to hook Movable Type up to GeSHi.
    • MTMacro, needed to make the transcode syntax bearable.
    • MTRegex, to add conditional behavior to the macros.
  • Acronym, used to enable mouse-over acronym expansion (so you can easily find out what DTD, XHTML, PCMCIA, etc. stand for – and it’s all automatic).
  • LivePreview, because none of the stuff above (except Markdown and SmartyPants) renders correctly in the default preview view.

Here’s the macro used to turn ‘<pre lang="java"> …. </pre>’ into the proper format for transcode:

 <pre><code>transcode-language: java
       ...
 </code></pre>

Macro:

 <pre>```
 transcode-language:
 ```
 <pre></pre>

Sooner or later I’ll probably take another look at blogging from emacs.

Polymorphic Generics in C#

Generics are great for adding some level of type safety to C#, but you may run into problems when using generic classes with objects that aren’t of the exact class or interface indicated by the generic type parameter. Enter generic constraints.

Generic constraints let you restrict the set of types a type variable can apply to. For example, assume you have three classes:

  
class Aclass{ /*...*/ }
class Bclass : Aclass{ /*...*/ }
class Cclass : Aclass{ /*...*/ }

If you have a method that takes a list of Aclass, you may want to be able to call it with a list of Bclass or Cclass as well. The naive approach doesn’t work, however:

 
public void foo(List<Aclass> myList){ /* ... */ }

When you call foo with a List<Bclass> or a List<Cclass>, the compiler will complain that the types don’t match (unless you provide overloads of foo). Instead, make foo a generic method, and specify a constraint on the type:

 
public void foo<T>(List<T> myList) 
    where T : Aclass 
{ 
  /* now you can treat
      myList as a List<Aclass>*/ 
}
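For comparison, Java expresses the same idea with bounded type parameters. A minimal, self-contained sketch (the class and method names here are illustrative, mirroring the C# ones above):

```java
import java.util.ArrayList;
import java.util.List;

public class GenericsDemo {
    static class Aclass { }
    static class Bclass extends Aclass { }

    // Bounded type parameter: T may be Aclass or any subclass,
    // so a List<Bclass> is accepted without writing overloads.
    static <T extends Aclass> int countItems(List<T> list) {
        int n = 0;
        for (Aclass a : list) {  // each element is usable as an Aclass
            n++;
        }
        return n;
    }

    public static void main(String[] args) {
        List<Bclass> bs = new ArrayList<Bclass>();
        bs.add(new Bclass());
        bs.add(new Bclass());
        System.out.println(countItems(bs));
    }
}
```

The compiler accepts the List<Bclass> argument without any overloads, which is exactly the behavior the C# constraint buys you.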

This link goes over generics in C# in detail:

MSDN on Generics

Vanity, Chapter 1

…empty pride inspired by an overweening conceit of one’s personal attainments or decorations; [1913 Webster]

I’m a sucker for pretty desktops and window managers. This weakness has yet to make me succumb to the lure of a full Gnome (or KDE) desktop, however. (Although I did play with the kludge known as XGL/Compiz for a month or so.) I alternate between Enlightenment and FVWM; however, I do use a fair number of GTK apps. Eventually, I hope to create some GTK apps (or wxWidgets, or etc. – it all boils down to GTK showing up on screen, though). Up till now I’ve suffered through with the dreadful defaults – in my opinion, of course – but that just changed with gtk-theme-switch2.

gtk-theme-switch2

It is awesome :) Theme loading, previewing, setting, etc. – all without a hint of the Gnome infrastructure to screw up your keyboard layout, fonts, or power management.

In Agreement at Last..

Finally, Windows did something I agree with:

If only it had caught itself during the OS install instead of just a measly user-space app…

Maybe this has been fixed in Vista… (…could that be the reason it’s still not out? I suppose I shouldn’t get my hopes up.)

At the Top of the Stack…

As is the case with many geek-endeavors, the things I’m currently working on have nothing to do with the goal I set out to achieve. At the moment I’m trying to find a way to convert xhtml into muse markup.

Why? Because your average Java programmer Just Doesn’t Get It when it comes to building a UI with Swing. Obviously.

Frustration with Swing UIs led to the idea that I should throw together a tutorial on building a GUI with Swing while paying attention to separation of concerns re: layout, manipulation, threading, and data. Blogging seemed like a prime medium for such a tutorial, since each post could be tagged with a meaningful tag (e.g. swing_tutorial) and I could post in sections.

This, of course, necessitates a blog (which I have, obviously), but mine emphatically does not have any input mechanisms worth using for an extended period of time. The WordPress online editors are, in a word, pathetic. They may work great if you just want to stream your consciousness out to the world where it can pollute everyone’s Google results with poor spelling and meaningless chatter, but the editors available simply can’t handle source code.

It is possible to feed the blog pure HTML, which is (unfortunately) a step up from the rich editor, but that isn’t going to cut it either, because it could be so much better.

Seriously, look at MediaWiki, or TWiki, or any of the millions of wiki engines out there. They ALL support better input mechanisms than WordPress. So why am I using WordPress at all? Because for everything else – user support, RSS feeds, tags, database-backed storage, plugins, etc. – it works great. (As far as I know; come back in a week when I’ve resolved the editor issue to hear what else sucks. That should be enough time for me to find it.)

Solutions

I hope to get around the pain-in-the-ass that is the WordPress editor by using Emacs to post content via WordPress’s XML-RPC support. Therefore I need to find, enable, or create support in Emacs for the following things:

  • Wiki-like markup, with support for code tags that can be interpreted by the GeSHi plugin on WordPress.
  • A translator that converts wiki-like markup to XHTML, and back.
  • XML-RPC support, and the ability to retrieve and submit blog posts.
  • Multiple major mode support for the wiki-like text mode, so that the aforementioned code tags use the correct font-lock mode.
  • A preview capability, so blog posts can be converted, fed to a w3-el buffer and viewed, then edited again prior to posting and publishing.

Most of these are possible in some fashion or another, but currently I’m stuck on the “translator for xhtml to wiki-like markup”. Stay tuned for improvements, and news as I pop things off the stack.

Someday maybe I’ll get around to talking about java.

Blogging With Emacs + mt.el

I’ve wanted to use Emacs as a blogging tool for a long time, but I’ve always run into issues. Today I ran across a blog post that describes an approach that works:

http://ektich.wordpress.com/2006/01/30/how-to-blog-from-emacs/

In addition to the instructions there, I had to pull down:

  • elib (with apt)
  • xml.el (from: http://www.astro.princeton.edu/~rhl/skyserver/xml.el)
Ping me with questions – more details soon to come as I play with mt.el, and possibly weblogger.el (which looks nicer, on first impressions).

(Not?) Ranting About .NET Collections…

The .NET collections continually frustrate me with their obvious omissions, even in .NET 2.0. Coming from a Java / Lisp background, I really expect two things out of a data structures API:

  • Lots of collections to choose from.
  • Easy manipulation of the structures you have available.

.NET doesn’t fill either of these requirements very well. At least we have generics now (which, admittedly, is a step above what’s available in Lisp – with regard to types, anyway).

Today I ran into (yet another) annoyance with .NET collections – sorting arrays elegantly. Given an array, you can sort it in ascending order (according to the default comparer) with Array.Sort(..).

 // build & populate the array.
 double[] values = source.ToArray();

 // destructively!! sorts values; returns void, of course.
 Array.Sort(values);
    

That’s nice. Now, sort it in reverse:

 // build & populate the array.
 double[] values = source.ToArray();

 Array.Sort(values);
 // reverse the array.. adds O(n) ops.
 Array.Reverse(values);
    

or…

 // build & populate the array.
 double[] values = source.ToArray();

 Array.Sort(values, Double.ReverseComparer);
    

Oh, wait. There is no ReverseComparer on Double… actually, there’s no Comparer on Double either, but there is for most objects… so in general I could just wrap the comparer in a delegate or an anonymous class (to invert it) and use that.

Wait again.. C# doesn’t have anonymous classes, and Sort doesn’t take a delegate under any incantation. So, we could do this:

 private class ReverseDoubleComparer : IComparer<double>{
     public int Compare(double x, double y){
         return y.CompareTo(x);
     }
 }

 /*
 intervening code...
 */
 // build & populate the array.
 double[] values = source.ToArray();

 Array.Sort(values, new ReverseDoubleComparer());
    

That will work, but wow… for every type I’ll need to create a new class, and I can only deal with types that implement IComparable.CompareTo(..). Thankfully, I can use generics and some constructor overloading to deal with both situations:

 public class BackwardsComparer<T> : IComparer<T>{
    public BackwardsComparer(IComparer<T> c){
      _comparer = c;
    }

    public int Compare(T x, T y){
        return _comparer.Compare(y, x);
    }

    private IComparer<T> _comparer = null;
 }
    

Now, we just need to do the following:

 string[] strs = strSource.ToArray();

 // sort strs in reverse alphabetical order:
 Array.Sort(strs,
    new BackwardsComparer<string>(StringComparer.CurrentCulture));
    

And there we have it – reverse array sorting without the additional cost of a Reverse() call, and without case-specific classes floating around. (The complete listing for BackwardsComparer and its test suite are here: BackwardsComparer.cs and BackwardsComparerTest.cs.)
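For contrast, here is a minimal sketch of what the Java collections API (the one this rant keeps comparing against) hands you out of the box. The class name is illustrative, and note Java’s own wrinkle: the comparator overload of Arrays.sort only applies to object arrays, so you need a boxed Double[]:

```java
import java.util.Arrays;
import java.util.Collections;

public class ReverseSortDemo {
    public static void main(String[] args) {
        // Boxed Double[] rather than double[]: comparators
        // don't apply to primitive arrays in Java.
        Double[] values = { 3.0, 1.0, 2.0 };

        // Reverse natural order, no hand-rolled comparer class needed.
        Arrays.sort(values, Collections.reverseOrder());
        System.out.println(Arrays.toString(values));
    }
}
```

Collections.reverseOrder(Comparator) can also wrap an explicit comparator, much as BackwardsComparer wraps one above.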