March 22, 2008

Comments

Jag

Just a thought: if we don't like exclusion whitelists, then why not do away with the wildcarded exclusion lists too? After all, they are just whitelists with longer "masks", aren't they? It seems to me that the art of transforming a web page so that it looks great on a mobile device that asks for it is at the heart of the issue. Not the kilobyte size of the page, nor the URL that it sits on, nor anything else. Any other "adjunct" rules or processes simply serve as "distractions" that could make the art go bad, or prove that the art is not well developed enough in the first place.

Basically: if a transformation process that claims to make pages look great on mobile phones cannot tell if the page being requested already looks great on a mobile phone, then surely it's not very good at transforming pages to look good on mobile phones? No amount of whitelisting, exclusion masking or page-weight decision-making is going to change that, surely?
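To make the "longer masks" point concrete, here is a minimal sketch of how a transcoding proxy might check a requested URL against an exact whitelist and a wildcarded exclusion list. The URLs, patterns, and the should_leave_alone helper are all hypothetical, and Python's fnmatch is used purely for illustration; the check is the same either way, only the specificity of the mask differs.

    # Illustration only: both an exact whitelist entry and a wildcarded
    # exclusion pattern reduce to a pattern match against the requested URL.
    from fnmatch import fnmatch

    WHITELIST = ["m.example.com/index.html"]      # exact "masks"
    EXCLUSIONS = ["m.example.com/*", "*.mobi/*"]  # longer, wildcarded "masks"

    def should_leave_alone(url):
        """True if the transcoder should pass the page through untouched."""
        exact = any(fnmatch(url, entry) for entry in WHITELIST)
        wildcard = any(fnmatch(url, pattern) for pattern in EXCLUSIONS)
        return exact or wildcard

    print(should_leave_alone("m.example.com/index.html"))  # True  (exact match)
    print(should_leave_alone("m.example.com/news"))        # True  (wildcard match)
    print(should_leave_alone("www.example.com/news"))      # False (would be transcoded)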
