Friday, December 22, 2006

Back and Live

After much frustration with Windows Live Writer (Beta) and Blogger (Beta) having communications issues, I am now back blogging with Windows Live Writer.

 

Gotta love trying to get two beta products communicating.

 

 


Monday, December 04, 2006

As Kermit would say ... "It's not easy being Green"

A colleague of mine, Darren Neimke, has posted a rather interesting article on his blog, which has got me thinking. Now I have to admit, I do consider myself to be a "Greenie"; however, far from having "jumped on the bandwagon", I have been a member of The Australian Greens for well over 3 years now, and in fact I can trace my pilgrimage from the high slopes of comfortable numbness about environmental issues into the radical valley of the shadow of Greenie Activism to a course I took back at uni, almost 11 years ago: a philosophy course aimed at engineers called Technology and Human Values, in which we studied "systems engineering" approaches to the ways technologies are used and abused both here and in developing countries.

Anyway, my main reason for writing this article is to suggest to Darren that I feel he has understated some of the issues slightly. I am assuming that some of the understatements were intentional and for effect, like "... Or some ice melting in Antarctica", trivialising the disappearance of entire glaciers. However, the one I really want to concentrate on is the issue of research, and the problem of people reading something and just believing it. From his post:

Research is a skill that is being diminished by the one-click world of Google and the new media. People hear something and believe it to be true without even having the faintest clue of how they would ratify such assertions.

I think there are a number of problems in this statement, all rolled into one.
Firstly, yes, we live in a society based on instant gratification. Google is a great tool that can be used to justify any point of view you want to throw at it.
Secondly, I think it is quite difficult to "ratify" certain facts. The problem comes from the nature of the beast that is science. To really fully understand the latest scientific findings requires you to pretty much be at the forefront of scientific research in that particular area, and this requires not only having a PhD, but being actively involved in hands-on research. The problem is that even scientists at this level can't always seem to agree, and the way in which research projects get funding often requires researchers to make bold (sometimes even ridiculous) statements to gain the attention of the various institutions that are willing to put up the money to fund the research. How is your average punter expected to know what the truth is, even if they have tracked down the information from people considered leaders in their field?

Not wanting to be all doom and gloom, I think that there are solutions emerging to these problems.

If I may use a buzzword: with the Web 2.0 paradigm we are starting to understand the "wisdom of the crowd". In fact, Google has used this in its search algorithms for a while, but there are now even more powerful tools that enable collaboration on an unprecedented scale; sites such as Wikipedia are classic examples of this.

With respect to the scientific community, by its very nature it is a dynamic exchange of ideas and theories. Unlike other debates we encounter, such as religion, politics etc., there are not the same institutionalised barriers to challenging the status quo. If a scientist or group of scientists have a theory, it will be critiqued by many other scientists, who will then be able to repeat the same experiment and either validate the findings or disprove them. Even if a scientist or group of scientists start to hold some political sway, and spread inaccurate theories for a time, eventually someone will be able to challenge the theory, and a better theory will prevail.

So where does that leave us with the environment debate? Well, my personal feeling is that the premise that us 6 billion humans are having a noticeable impact on the world we live in has been kicking around for long enough now, and the vast majority of scientists working in the field seem to agree on some of the big-ticket items that constitute climate change. What the results might be is admittedly speculation. Educated speculation, but speculation nonetheless. However, I think we can no longer use the "ignorance is bliss" approach to keeping the status quo. We have to realise that systematic change is essential if we as a species want to maintain the quality of life that we currently enjoy. Even more so if we want to take any significant number of non-human species into the future with us. This then becomes a battle that sometimes needs to be fought on a political level, and sometimes on a personal level. So Darren, next time I'm bending your ear about how stupid it is to be logging our water catchments for wood chips in the middle of a drought, and you slip seamlessly into that glazed-over look, at least you'll know where I'm coming from.

Sunday, December 03, 2006

Technorati and claiming your blog

I started playing around with Technorati the other day and decided to "claim my blogs", just to see if anyone else was linking to me. I have two blogs: this one, which is my geek blog, and another that I reserve for my thoughts on politics/art/etc... anything that's not technology related. To my surprise I found quite a few people linking to this blog, but to my disappointment I found absolutely NO links to my other blog. I guess that's no surprise really, considering I have never been very active in promoting it. So I thought I'd attempt to add some Technorati goodness by putting in a shameless plug on my popular blog for my less popular blog.

Musings of a Morbid Mind.

No prizes for guessing that alliteration is one of my favourite literary devices.

Saturday, October 28, 2006

Development vs. Production Environment

An interesting post on development environment issues by Mitch Denny caught my attention today, not only because he sent a link around on our internal tech list, but because it's something that I think continually trips up a lot of developers (myself included). Mitch basically argues (and I agree) that development should be done in the same basic environment as the production system. He plays this off against infrastructure people, who tend to want to keep a gap between production and development for safety reasons; I mean, who knows what holes these crazy developers are going to poke in our top-notch security infrastructure.

Most of my experience has been in the shrink-wrapped software industry, and as such there is little point trying to match development and production environments. In that case I suggest developing in the environment that makes the developers most productive, but then making sure you test, using whatever virtualisation technology you prefer, on ALL platforms in your supported platform matrix.

With enterprise development, however, I would say that one should develop in an environment as close as is practicable to the production environment. It never ceases to amaze me the number of problems that arise because of differences between the development and production environments. Have you ever heard a developer say "gee, it doesn't do that on my machine"? The closer you make development environments to the production environment, the less you'll hear this annoying phrase.

The flip side of the coin is that developers need to understand fairly well the differences between their development environment and their production environment. One thing that has bitten me in the past, and a lot of other developers as well, is the difference between IIS and the Visual Studio Development Web Server (the web server formerly known as Cassini). There are a number of differences, and also a number of reasons that developers like to use it over IIS. One major difference which always seems to trip me up is that Cassini passes every requested file through the ASP.Net pipeline, as opposed to IIS, which will serve a lot of files normally (ie *.css, *.jpg, *.gif, *.png, *.html, etc...) and only pass the ASP.Net-specific files (*.aspx, *.asmx, *.ashx etc...) through the ASP.Net pipeline. When you first start a project you don't really notice any difference, until you think to yourself "gee, it would be nice to add forms authentication now", and so you put in the standard <deny users="?" /> element and all of a sudden your login page has no graphics and doesn't look anything like how you designed it. This is because Cassini is sending all of your image and CSS files through the ASP.Net pipeline, and they are getting blocked by the authorization module because you are not authenticated... yet.
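If you hit this, one fix is to carve out an anonymous-access exception for your static files in web.config. Here is a minimal sketch, assuming the images and style sheets live in a folder called Content (the folder name is my own):

<location path="Content">
  <system.web>
    <authorization>
      <allow users="*" />
    </authorization>
  </system.web>
</location>

With that in place the login page keeps its looks under Cassini, and the extra configuration is harmless under IIS.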

Tags: ASP.Net, IIS

Friday, October 13, 2006

Visual Studio 2005 misbehaving after Automatic updates

I have spent the morning struggling with Visual Studio. It has been crashing non-stop, but I have got to the bottom of the problem, and am happy to share the solution with the rest of the world.

Visual Studio was happily starting and loading my solution, but whenever I clicked on the Test menu and attempted to do something like open the Test View window, VS would crash. If I tried to open a localtestrun.testrunconfig file, it would crash with the following error: "Could not open assembly System.Data version 2.0.0.0 .... The system cannot find specified path".

I won't bore you with the process I had to go through to find the resolution, but the problem turned out to be that this morning my machine installed some updates from Windows Automatic Updates. Most of these updates were successful, but one of them, namely "Security Update for Microsoft .NET Framework, Version 2.0 (KB922770)", had failed with the following error: "Error Code: 0x643". After doing a search on this particular error I found the following article: http://support.microsoft.com/kb/923100. However, I did not need to follow the resolution outlined there; I simply went to the Windows Update site and manually installed the update.

I hope this helps someone out before they have to spend half a day like I did trying to track it down.

Tuesday, October 10, 2006

PopupControlExtender inside an EditTemplate

I found a problem today with the PopupControlExtender (part of the ATLAS Control Toolkit) that occurs when it is embedded inside an EditTemplate for a GridView control. I suspect the same problem would occur inside any template field, and potentially for other ATLAS Control Toolkit controls.

I wanted to edit a date field inside my EditTemplate, and so I set about implementing something not too dissimilar to the example for the PopupControlExtender on the Atlas Control Toolkit website. First I prototyped it on a page all on its own; no problems, everything worked as expected. Then I attempted to place the same code inside my EditTemplate field. At this point, when I attempted to go into edit mode, I received the error message "Assertion Failed unrecognized tag atlascontroltoolkit:popupControlBehavior". Obviously the necessary JavaScript libraries were not being downloaded when the page rendered. I assumed this was because of the dynamic way in which the EditTemplate renders.

To fix this problem, I simply placed an empty PopupControlExtender onto the design surface of my page. This ensures the necessary JavaScript files are downloaded, so that when the EditTemplate is rendered, the ATLAS engine knows what you're talking about.
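For the record, the workaround is literally just a do-nothing extender declared somewhere on the page, something along these lines (the tag prefix and ID here are my own, and will depend on how you registered the toolkit):

<atlasToolkit:PopupControlExtender ID="DummyExtenderForScripts" runat="server" />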

I also encountered a couple of bugs with the popup control being wrapped inside an UpdatePanel. I managed to fix these by simply downloading the latest CTP of the ATLAS Control Toolkit (now called the "Microsoft ASP.Net Ajax Toolkit"... sorry guys, this is yet another example of how insisting on meaningful naming of products can really kill any excitement and mystique around them. I'm afraid it just doesn't have the same panache as "Ruby on Rails", and if you really want to speak to the buzzword-conscious managers of today, you have to break away from boring naming conventions and inject some creativity into the process).


Monday, October 09, 2006

Reflections on Web Directions

I have to be the slowest blogger in the IT community. Over a week ago I attended the Web Directions conference. The conference was really good, and has inspired me in a few areas. Many other bloggers have had their say on the conference, but I may as well have mine, just for the sake of redundancy.

One of the common themes from both the accessibility speakers (Gian Sampson-Wild and Derek Featherstone) and the user experience speaker (Kelly Goto) was user testing, and not just asking the user what they think. Kelly Goto's quote was 'We listen to what the users "didn't say" and observed what they did'. I have also been challenged in the area of accessibility; the quote that stuck with me came from Derek Featherstone: "The web is accessible by default, we make it inaccessible". It has inspired me to go and have a look at the way I develop, and the bad habits I've gotten into. The truth of the matter is that it is not really that much effort to get into good habits that make web sites more accessible.

I really liked Jeremy Keith's AJAX sessions. The first session started out a little basic, but became more interesting in the second half. In the second session Jeremy discussed a technique he called "Hijax", which is aimed at ensuring accessibility and support for down-level browsers. I am looking forward to seeing how well I can apply his techniques using ATLAS.

While on the topic of libraries, Jeremy did make one statement that I'm not sure I can agree with. He said that he didn't believe in using third-party libraries for doing AJAX, firstly because AJAX isn't that complicated, and secondly because if something goes wrong in the library, as a developer you'll need to be able to fix it. I almost agree with his first statement, but even still, I am a big fan of NOT re-inventing the wheel. If there's a library that has a great ranking control, for example, and they've coded it so that it works across all browsers in your supported browser matrix, then there is a lot of testing and coding that you can potentially avoid. There are always bugs in any software, and AJAX APIs are no exception; the skill of a good developer is to be able to use an API in such a way that they can work around any bugs in the underlying API. I've lost count of how many times I've had to do this myself. Also, as a Winforms developer, I am extremely grateful that I do not have to write Win32 anymore, and I'm sure those of you who have written Win32 would agree with me. Having said that, his "Hijax" mechanism of progressive enhancement is really cool, and my current goal is to go through all the ATLAS controls (the API I'm currently using) and see how I can apply this technique to them.
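As I understand it, the gist of Hijax is to build plain links and forms that work without JavaScript, and only then layer script over the top. A rough sketch of the idea, in my own code rather than Jeremy's (fetchFragment is a hypothetical XMLHttpRequest wrapper):

function hijaxLinks()
{
    var links = document.getElementsByTagName('a');
    for (var i = 0; i < links.length; i++)
    {
        if (links[i].className == 'hijax')
        {
            links[i].onclick = function()
            {
                // fetchFragment is a hypothetical helper that requests the href
                // via XMLHttpRequest and injects the result into the current page
                fetchFragment(this.href);
                return false; // script-capable browsers skip the full page load
            };
        }
    }
}

// down-level browsers never run this, so the plain href keeps working for them
window.onload = hijaxLinks;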

I went along to John Allsop's talk on microformats to hopefully pick up anything I'd missed from the first time I heard it, and was inspired all over again.

One minor thing I think the organisers can improve on is something I saw at the Tech Ed conference: the "re-charge desks". The Tech Ed organisers had desks with a series of power boards that people could plug their laptops into between sessions. This would have been really good for me, as my laptop is now a year and a half old and my battery is showing its age; it wasn't even lasting 2 hours.

Worth mentioning:

  • Kelly Goto made reference to an example of an interesting user experience project called DateLens. It is a calendaring visualisation which plugs into Outlook (nothing to do with the web at all). I downloaded it and it's quite interesting, definitely worth a look. It is also written in .Net 1.1, which must have been quite an achievement, and I think with WPF just around the corner there will be a lot more of these types of user experience applications about.
  • The inaugural McFarlane Prize for excellence in standards-based web design was awarded to http://www.museum.vic.gov.au/caughtandcoloured/. This site is really worth checking out. It is especially interesting to me because it is written in ASP.Net, demonstrating that it is possible to write high-quality, standards-compliant, accessible web sites using ASP.Net.
  • The last speaker of the conference was Mark Pesce (inventor of VRML), who explained some of his concepts of social software, and described a project he worked on that attempted to aggregate your social behaviour patterns and create social network models simply by using the capabilities of a Bluetooth-enabled phone. I thought there’d be some people at Readify quite interested in this sort of stuff. http://relationalspace.org/

tags: wd06

Sunday, September 17, 2006

ATLAS Data Applications and Down-level support

I must confess that although I am excited by the whole AJAX phenomenon, and the rich client experience that it provides for the majority of desktop browsers, I sometimes feel concerned about those who, for whatever reason, aren't able to experience the full beauty of a JavaScript-enabled client. There are two main groups of people missing out:

  1. Users with certain accessibility needs (often browsers and tools designed for people with disabilities have to run without the JavaScript support we usually take for granted), and
  2. PDA users (this is an increasing market, and to date only 1 browser has implemented any kind of JavaScript engine for devices).

So I am always looking for ways to enhance these users' experience.

My AJAX toolkit of choice (more through accident than intent) is ATLAS (Microsoft's library). ATLAS contains a data-enabled control called a ListView, which is quite powerful. It works on a client-side, JavaScript-based object model, and the best way to get to know it is to read the data section of the ATLAS documentation.

 

So the idea is: get a funky ATLAS-based website working, and then attempt to create, as quickly as possible, a version of that website that can be used by people who can't (or won't) use JavaScript (similar to what Google does with Gmail).

 

I decided to make the sample as simple as possible without losing anything along the way, so I created a Shopping List that simply contains a quantity (double) and an item (string).

The beauty of the ListView in ATLAS is that it can use the System.ComponentModel approach, similar to other data binding in ASP.Net 2.0. So the first step is creating a web service that derives from Microsoft.Web.DataService (refer to the ATLAS documentation for more details). I then create my ListView templates, hook it all up using XML-Script, and pretty it up with any CSS I may care to add. The result can be seen here (please forgive me, I'm not a graphic artist). Once this was done, I needed to create a non-ATLAS version of the same page. As my web service already uses the System.ComponentModel approach, it was extremely easy to add a GridView control, add an ObjectDataSource, and wire it up, and away it goes. All in all it took me about 10 minutes to create the non-ATLAS page. It would have taken me another 2 minutes to enable things like edit/delete, but for the moment I wanted to keep it the same as the ATLAS version; this may be the subject of a future post. The only significant difference between the two pages is the presence of a select link on the non-ATLAS version, whereas the ATLAS version manages the select operation simply by handling the mouse click.
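To give a feel for it, the data layer is roughly this shape (the class and method names here are my own, not from the actual sample). Decorating it with the System.ComponentModel data object attributes is what lets both the ATLAS data service and an ObjectDataSource discover the select method:

using System.Collections.Generic;
using System.ComponentModel;

public class ShoppingListItem
{
    private double quantity;
    private string item;

    public double Quantity { get { return quantity; } set { quantity = value; } }
    public string Item { get { return item; } set { item = value; } }
}

[DataObject]
public class ShoppingListData
{
    // in the real sample this would live in a database or session state
    private static List<ShoppingListItem> items = new List<ShoppingListItem>();

    [DataObjectMethod(DataObjectMethodType.Select)]
    public static List<ShoppingListItem> GetItems()
    {
        return items;
    }
}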

 

Once you have these two pages, it is simply a matter of choosing your favourite way to direct users with different browser capabilities to the correct site.

Sunday, August 20, 2006

ajax gotcha for new players

Being fairly new to the "Brave New World" of AJAX, this week I fell into a fairly common trap that gets new players. I studiously looked around for the best-practice implementation of making an HTTP request using JavaScript, and came up with the following JavaScript function, which allows the caller to make the HTTP request and specify a callback function, and even goes the extra mile of implementing timeouts (I guess the only other thing I really should do is implement an OnErrorCallback).


function MakeRequest(url, timeout, onCompleteCallback, onTimeoutCallback)
{
    var completeCallback = onCompleteCallback;
    var timeoutCallback = onTimeoutCallback;

    var xmlHttp = createXMLHttpRequest();

    if (xmlHttp)
    {
        xmlHttp.onreadystatechange = function()
        {
            if (xmlHttp.readyState == 4)
            {
                window.clearTimeout(timeoutId);
                if (xmlHttp.status == 200 || xmlHttp.status == 304)
                {
                    completeCallback(xmlHttp.responseText);
                }
            }
        };

        xmlHttp.open("GET", url, true);

        // abort the call and notify the caller if it takes longer than 'timeout' ms
        var timeoutId = window.setTimeout(function()
        {
            if (callInProgress(xmlHttp))
            {
                xmlHttp.abort();
                timeoutCallback('timeout');
            }
        }, timeout);

        xmlHttp.send(null);
    }
}

function callInProgress(xmlHttp)
{
    // readyState 1-3 means the call is still in flight; 0 and 4 mean it isn't
    switch (xmlHttp.readyState)
    {
        case 1:
        case 2:
        case 3:
            return true;

        default:
            return false;
    }
}

function createXMLHttpRequest()
{
    var xmlHttpRequest;
    if (window.XMLHttpRequest)
    {
        try { xmlHttpRequest = new XMLHttpRequest(); }
        catch (e) { return null; }
    }
    else if (window.ActiveXObject)
    {
        try { xmlHttpRequest = new ActiveXObject("Msxml2.XMLHTTP"); }
        catch (e)
        {
            try { xmlHttpRequest = new ActiveXObject("Microsoft.XMLHTTP"); }
            catch (e) { return null; }
        }
    }
    else return null;
    return xmlHttpRequest;
}


I then wanted to call a .Net HttpHandler, and let's just say for argument's sake that the HTTP handler looks like this:


public class Handler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // simulate a slow response so the timeout logic can be exercised
        System.Threading.Thread.Sleep(6000);
        context.Response.ContentType = "text/plain";
        context.Response.Write(Guid.NewGuid().ToString());
    }

    public bool IsReusable
    {
        get { return false; }
    }
}


So the call from JavaScript would simply look something like this:



MakeRequest('Handler.ashx?cookie=abcd&another=xyz',
    20000, onRequestComplete, onRequestTimeout);

function onRequestTimeout(response)
{
    alert('timeout');
}

function onRequestComplete(response)
{
    alert(response);
}



The problem, as I found out through good old Wikipedia, relates to Internet Explorer's caching of HTTP "GET" requests, as described in this article on XMLHttpRequest.
As discussed in the article, there are a number of ways around this: switch to using POST, or ensure that you switch caching off in the HTTP headers. I chose the former.
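For what it's worth, if you would rather keep using GET, the latter approach only needs one extra line in the handler; a minimal sketch based on the handler above:

public void ProcessRequest(HttpContext context)
{
    // tell IE (and any proxies along the way) not to cache this response,
    // so repeated GETs to the same URL actually hit the server
    context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
    context.Response.ContentType = "text/plain";
    context.Response.Write(Guid.NewGuid().ToString());
}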

Sunday, August 13, 2006

Automatic updates + Automatic Reboots

Is it only me, or does anyone else find this really frustrating?

[Screenshot: the Windows Automatic Updates restart prompt, with the Restart Later button disabled.]

I got this in the middle of reading my email this morning, and I really didn't want to reboot in 5 minutes. I would have quite happily rebooted in half an hour or so, when I'd finished what I was doing. The fact that the Restart Later button was there but disabled just added to the insult.

I think, from a layer 8 perspective, Microsoft need to be really careful when forcing reboots. If this happens too frequently, it may be frustrating enough that people actually turn Automatic Updates off, making their systems progressively less secure and more vulnerable.

Saturday, August 12, 2006

Web Directions

On Thursday night I went along to a Web Directions preview night, where two of the speakers from the upcoming Web Directions conference, Ben Barren from gnoos and John Allsop from westciv, were giving previews of their presentations at the conference. It was a good night; I particularly enjoyed John's talk on microformats. This is a topic that I have been interested in for a while now, and I may have more to say about it later.

Unlike other events I attend, where I know a fair few people, at this event the only person I'd met before was Nigel Watson, an Architect Advisor from Microsoft. So it was good to make new contacts and expand my circle beyond the Melbourne .Net developer and SQL Server sphere. I got to meet Cameron Adams, whose book "The JavaScript Anthology" I am currently halfway through, and also met the people from westciv, who write the best damn web standards online training courses in existence.

When I first heard about the Web Directions conference, I thought that as much as I'd love to go along, I probably wouldn't be able to attend, given that I was already attending Tech Ed at the end of August, and I'd be pushing it a little if I asked my employer to pay for another conference and let me have the time off so soon after Tech Ed. As the saying goes, "the best laid plans of mice and men gang aft agley": there was a lucky door prize drawn at the end of the night, which was a free ticket to attend the Web Directions conference. Guess who won... So now all I have to do is convince my employer to let me have a couple of days off to attend... that shouldn't be too hard.

Thursday, July 13, 2006

Stepping through ATLAS Javascript files

I'm trying to debug an application I'm writing using ATLAS, and I'm wanting to know why my attempts to make browser client magic happen are resulting in JavaScript errors. Fortunately Visual Studio provides a level of debugging for JavaScript files, which means that if you're in debug mode when a JavaScript error occurs, you'll automatically be taken to the point in the code where the problem occurred, and given some useful things like a stack trace and a variable watch window. You can also set breakpoints in your own code. However, if you have registered the scripts using an Atlas ScriptManager, as is advised when writing ASP.Net applications,


<atlas:ScriptManager ID="ScriptManager1" runat="server" >
    <Scripts>
        <atlas:ScriptReference ScriptName="AtlasUIDragDrop" />
    </Scripts>
</atlas:ScriptManager>


you will not be able to step into any of the ATLAS libraries you happen to be using, as the debugger complains that the source files are not available. This is because of the way the ATLAS ScriptManager embeds the JavaScript.

Today I found myself thinking that it would be handy to be able to step into these files to find out what's going on prior to the error.

To solve the problem:

  1. Copy the necessary JavaScript files from the install location into your project and mark them as "Embedded Resource". In my case these were Atlas.js and AtlasUIDragDrop.js.

  2. Comment out or remove the Atlas Script Manager component

  3. Add the script files as you would add any normal JavaScript file (ie with the <script> tag), as shown below.
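For example, assuming the scripts were copied into a folder called ScriptLibrary (the folder name is my own):

<script type="text/javascript" src="ScriptLibrary/Atlas.js"></script>
<script type="text/javascript" src="ScriptLibrary/AtlasUIDragDrop.js"></script>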



Once you have done this you should be able to step into the ATLAS JavaScript files to your heart's content, but be warned: you might find it a little scary in there.

The only problem with this is that it won't work for server-side Atlas controls, such as the UpdatePanel or extender controls, but if you're not using them, or if you can comment them out while you debug stuff, then you're set.

Thursday, July 06, 2006

Google Maps and sustainable transport

I have found a great site that is a mashup of Google Maps: http://www.bikely.com. The idea is that you can search the user-entered database for the best way to get between various points by bike. The website also allows you to sign in and create your own bike routes, so you can share your favourite rides with others. You can even put comments at each point that appear as pop-ups on the map. I'll have to start riding my bike more so that I can contribute.

Tuesday, June 27, 2006

Browser Version hell

Those who are familiar with the concept of DLL hell and have experienced it first-hand often point to web-based applications as the saviour from such deployment issues, and to an extent they're correct. However, as the push for a richer client experience from web applications grows, and as the technologies that enable it mature, DLL hell is being replaced by browser version hell. We are all aware that Internet Explorer and the Mozilla-based browsers (Firefox) have their own peculiarities, but even beyond that are the versions of these browsers. In fact, IE 6 has been particularly slammed for some of its, shall we say, "non-standard" behaviour. As a result I have been a big fan of Firefox, which is considerably more standards-compliant. This has led to web designers targeting Firefox first, and then tweaking sites as needed for other browsers.

For the past 6 months or so I have been neglecting Firefox a little in order to try out the all-new IE 7 in its various beta forms. IE 7 is supposed to be far more standards-compliant, amongst adding other features that users of Firefox and other browsers take for granted. As a result I hadn't updated Firefox for a while, and using Firefox version 1.0.7 I was testing out a new Atlas-based control called the collapsible panel control, and the bad news is: it doesn't work. The area that should display when the panel is expanded is completely blank. Upgrading to version 1.5.0.4 fixes the problem completely. Now admittedly the Atlas Control Toolkit is very much in a pre-release state, but it does raise a very pertinent question.

Where should web developers draw the line and say "I'm not going to support versions of browser X before version Y", and how should web sites indicate that a user should upgrade their browser? Also, how many different versions of browser X should we realistically be expected to test under? This, I believe, is different to the idea of degrading nicely (ie what should happen if a browser doesn't support JavaScript, or only supports a limited subset of JavaScript). All I can say is that I hope the team at Microsoft fix the problem with the control for Firefox version 1.0.7.

Monday, June 26, 2006

Web Application Projects and SQL Express Data Sources

I have been struggling today with some SQL Express related data issues. I have been trying to use a combination of Web Application Projects and ATLAS; as I am convinced that Microsoft have completely stuffed up with their "Web Site" style projects, I really don't want to use them. To do this you simply create a new Web Application Project, then copy all the necessary settings from the sample ATLAS web.config file into your own web.config file, and ALL is fine... that is, until you want to do something with SQL Express.

Regardless of ATLAS, the standard approach is to add a new item to the project and select a SQL Database. This uses SQL Express in a mode called "User Instances", which is designed to allow you to treat the database's primary file (mydatabase.mdf) as though it is just that, a file (impressive if you know what's really going on under the hood and understand how SQL Server normally works), and this is how all the Microsoft demos go, except that they use Web Sites instead of Web Application Projects. The theory is then to add tables, and create a DataSet off those tables. So I thought it would just be a simple matter of creating a Web Application Project and following exactly the same logic. WRONG!!!

My first efforts yielded the following error

Description: An error occurred during the compilation of a resource required to service this request. Please review the following specific error details and modify your source code appropriately.

Compiler Error Message: CS0122: 'MyWebApplicationProject.Properties.Settings' is inaccessible due to its protection level

This is because the default way of storing the connection string is to use the Properties folder and create an entry in the Settings.settings file. In this case there is a class called MyWebApplicationProject.Properties.Settings. This has a protection level of "internal", meaning that only code from within the same assembly can refer to it. The code attempting to access it is NOT in the same assembly, so it fails compilation... DOH.
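The standard alternative is to put the connection string in the connectionStrings section of web.config and read it through the System.Configuration API; a minimal sketch (the entry name "TasksConnectionString" is my own invention):

using System.Configuration;

public static class ConnectionStrings
{
    // reads the <connectionStrings> section of web.config
    public static string Tasks
    {
        get
        {
            return ConfigurationManager
                .ConnectionStrings["TasksConnectionString"].ConnectionString;
        }
    }
}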

So the next step was to rework it so that it got the connection string from the web.config file. After correcting all the places where I found a reference to the connection string, I received the following error.

Exception Details: System.Data.SqlClient.SqlException: An attempt to attach an auto-named database for file C:\Projects\AtlasApp\AtlasApp\App_Data\App_Data\Tasks.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.

Better; at least this time it's compiling. After spending entirely too much time attempting to figure out why SQL Server User Instances were failing, I eventually re-read the error message and noticed the path it was attempting to access: C:\Projects\AtlasApp\AtlasApp\App_Data\App_Data\Tasks.mdf

Notice the two App_Data directories. Yep. Simply remove the App_Data part of the path from the connection string and away it goes; everything now works.
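The underlying cause, as far as I can tell, is the |DataDirectory| substitution token that the generated connection strings use: in an ASP.Net application it already resolves to the App_Data folder, so a connection string that also names App_Data doubles the path up. Illustrative connection strings (paraphrased, not copied from my web.config):

// broken: |DataDirectory| already points at App_Data, so this resolves to ...\App_Data\App_Data\Tasks.mdf
Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\App_Data\Tasks.mdf;Integrated Security=True;User Instance=True

// working: resolves to ...\App_Data\Tasks.mdf
Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Tasks.mdf;Integrated Security=True;User Instance=True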

The reason I write this article is that I found very little documentation on these problems; problems which I would have thought would occur quite frequently if people are using Web Application Projects. Alas, the only mentions I have found are a recognition by Microsoft that there are problems, and a vague promise of a white paper on the issue from Microsoft.

In conclusion, I've had quite a frustrating day, and I think Microsoft still have some work to do on Web Application Projects.

Friday, May 26, 2006

Using a custom database for persisted SQL Server ASP.Net session state

I am in the process of upgrading a major project from ASP.Net 1.1 to ASP.Net 2.0, and my ideal is to be able to run two versions of the web application side by side, so that I can compare what's going on in the .Net 2.0 version to what should go on according to the .Net 1.1 version. This doesn't seem too difficult; just use a different web folder, right? Well, this would be the case except that we're using SQL Server persisted session state, and although .Net 1.1 and .Net 2.0 use different schemas, the default database name, "ASPState", is the same. After a bit of fiddling around I eventually solved the problem.

Step 1: run the following command from the appropriate .Net 2.0 framework directory, replacing ASPState_2_0 with the database name you wish to use:

aspnet_regsql -S localhost -E -ssadd -sstype c -d ASPState_2_0


This creates the session state database with the appropriate schema, using the specified database name instead of the default.

Step 2: modify the following line in the web.config file accordingly:

<sessionState mode="SQLServer" allowCustomSqlDatabase="true" stateConnectionString="tcpip=127.0.0.1:42424" sqlConnectionString="data source=localhost;Initial Catalog=ASPState_2_0;Trusted_Connection=Yes;" cookieless="false" timeout="15" />

Step 3: manually give the ASPNET user (or whatever user you are running the ASP.Net web application as) the appropriate privileges on the database tables in the newly created session state database.

MEDC 2006

Just got back from the MEDC 2006 conference, where I was lucky enough to receive an i-mate SP5 simply for attending.

The sessions were all very interesting; my personal favourite was probably the session on performance in the Compact Framework.

The other interesting thing I learnt is that people are actually reading my blog. Well, if that's the case, I might have to buck up my ideas and actually put some half-decent content on it.

Wednesday, May 24, 2006

A new and exciting project

This week I started a new and very exciting project. I can't really give away all the details, as the project is still in the planning phase, but what I can say is that it will be a Web 2.0 project. Now I know Web 2.0 can sound like a bit of a buzzword and as such lose meaning, but I think this article by O'Reilly explains Web 2.0 in a way that cuts through most of the marketing hype.

The Web 2.0 concepts that we are thinking of using in this project involve the idea of mash-ups of a few different services, the idea of collaboration and collective intelligence, technologies such as ATLAS (AJAX for ASP.Net), and the concept of the quick release cycle.

It is interesting that I have ended up on this project. I must be the least experienced ASP.Net developer at Readify, and the last time I did anything commercially for the web (almost 2 years ago now) it was firmly rooted in a Web 1.0 mindset. Since then I have been firmly stuck on the rich client side of the divide, writing Pocket PC and desktop style applications. The team lead on this project has suggested that this is probably a good thing, as we can take my experience designing and working with rich UIs and not be limited in our thinking about what is possible on the web. We are all getting really excited about the potential of this project, and I will give more details as the project progresses.

Monday, May 08, 2006

To Train or Be Trained... That is the question

Over the past 15 months I have been consulting to QSR, and it has been a fantastic experience. The only issue has been that since I've been there I have worked exclusively on a Winforms .Net 2.0 application. This has meant that my ASP.Net 2.0 skills were fairly rudimentary, and weren't improving beyond the tiny bit of time I've been able to allocate to professional development on the bus on the way to work.
So I decided to do the Readify Professional .Net course, which has recently been revamped for .Net 2.0. I rang Chris Hewitt, who was instructing the course, to tell him of my intention. Chris's paraphrased reply was something like... "You've been using .Net 2.0 for over a year now, you know more about .Net 2.0 than I do, you can teach some of the course".

So this is how I ended up teaching three modules of a course I was originally intending to sit. The three modules I taught were Generics (one of my favourite .Net 2.0 features), ClickOnce (at Chris's suggestion) and refactoring. It is my goal to be trained up to the point where I can teach the course myself, and it was good sitting under Chris, who has extensive teaching experience. I have always enjoyed teaching, ever since I supported myself through uni by tutoring high school maths and physics, and I know it is one of the best ways to ensure your own grounding in the material. The thing that scares me the most is questions. I know I shouldn't be scared, we aren't expected to know everything, and in the rare cases when Chris didn't know the answers, it proved to be a real journey of discovery for all of us as we attempted to find the answers to these questions.

Tuesday, May 02, 2006

ClickOnce Deployment Gotcha

In attempting to prepare for a module I am teaching tomorrow, I have been putting ClickOnce through its paces, and I'm finding there are some strange issues. I have been struggling with an unusual error for half the night.

I am publishing my files to a local site, then opening up Internet Explorer, navigating to the site and attempting to install the application. The application contains 4 files that require downloading, and it uses SQL Server Express. The install starts out fine, but then it decides it needs to momentarily open a browser window. Now my default browser is Firefox, and that is what it opens. At this point Firefox decides to prompt me for a username and password. Why, I don't know, but seconds after typing in my credentials, it brings up an error saying that it cannot download the application as there are missing files. The install fails.

I eventually figured out that if I change my default browser back to be internet explorer, the problem goes away. Hope this helps someone.

Monday, April 24, 2006

SQL Server Express with Advanced Services

As an avid user of SQL Server Express, it would be remiss of me not to herald the release of the much anticipated SQL Server Express Edition with Advanced Services with a blog post. This adds SQL Server Reporting Services and SQL Server full-text search, as well as its very own version of Management Studio, into the mix, which now gives it a distinct advantage over its predecessor, MSDE.

IMHO the only thing now missing from SQL Express is SSIS. I think that adding SSIS to SQL Express would allow people to get data in more quickly, meaning they would come up against the size and performance related restrictions of SQL Server Express sooner, and would therefore upgrade to other SKUs of SQL Server sooner.

Tuesday, April 11, 2006

SQL Server 2005 installation problems on various CPUs

As I have become the unlikely SQL Server 2005 "expert" at the company I am currently consulting to, I have been investigating a wide range of installation issues with SQL Server Express. These problems fall into a few main categories:

1. Installation of beta versions of SQL Server 2005 or the .Net Framework 2.0 (solution: uninstall all beta versions of SQL Server 2005 and .Net 2.0, and if you've installed beta versions of Visual Studio 2005, see http://msdn.microsoft.com/vstudio/support/uninstall/default.aspx)
2. The CPU is NOT fully Pentium III compatible
3. Various issues ranging from disk space to memory to OS service pack level etc...

In this post I want to deal with problem #2, the CPU compatibility problem.
There seem to be a number of CPUs that claim some kind of Pentium III compatibility, and as the hardware and software requirements for SQL Server 2005 state, a Pentium III is the minimum hardware requirement.

However, for whatever reason, the SQL Server installer does not check CPU compatibility too thoroughly. We have seen a number of cases where the installer issues a hardware warning but happily allows the user to continue installing SQL Server 2005, only to refuse, in the very final stages of the install, to start the SQL Server service. I'm not sure of all the reasons why this error can occur, but I know one definite problem is that SQL Server insists on the cache prefetching instructions of the Pentium III CPU, which are missing from a fair number of CPUs that claim Pentium III compatibility. I have asked the guys at Microsoft about the minimum hardware warning; they confess it is a problem, and are reviewing whether they are going to be more specific in the future. However, I think there needs to be some kind of list of all the different CPUs that one might reasonably expect to be able to install SQL Server 2005 on, but that don't seem to support it. So here it is, or at least the results of my research. If anyone else has seen similar problems, please leave a comment with the CPU specs and any links to discussions about the problem on various forums.

Via Eden - http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=304355&SiteID=1

500 MHz AMD Family 5, Model 8, Stepping 12 processor (AMD-K6-2) - http://www.sqljunkies.com/Forums/ShowPost.aspx?PostID=167&notification_id=421614&message_id=421614

VIA Ezra 800Mhz processor - http://www.sqljunkies.com/Forums/ShowPost.aspx?PostID=167&notification_id=421614&message_id=421614

Microsoft Virtual PC For Mac OS 7.0.2 (This emulates a Pentium II) http://www.microsoft.com/communities/newsgroups/en-us/default.aspx?dg=microsoft.public.mac.virtualpc&tid=90715b2e-77ba-4f8a-a189-daee29467ff0&cat=en_US_595a5881-be4a-4db4-97b0-5f3a78602e23&lang=en&cr=US&sloc=en-us&m=1&p=1

Transmeta - http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=17395&SiteID=1 and http://lab.msdn.microsoft.com/ProductFeedback/viewfeedback.aspx?feedbackid=cee0915a-6039-4ca6-aa0c-1b76285d4ac3

Saturday, March 25, 2006

Teddy Bear code reviews

I have been a big advocate of peer code reviews for some time now (see my previous post on peer code reviews), and I feel very lucky to have spent the past year and a bit consulting at a company that firmly believes in them. As my time there comes close to winding up, I am faced with the prospect of going to consult for a company that does not have the same attitude to peer code reviews, and I've been wondering what my approach should be.

My first thought is to attempt to convince the managers at whatever company I end up at that peer reviews are a good idea: explain the benefits to the overall development life cycle, and show them the graphs of the cost of fixing bugs at various points throughout the software development life cycle. This, in my opinion, should be enough to convince any decent manager that an extra 2 hours in development is better than the 20 hours of test/dev/management time if a bug makes it into system test, or worse, into the wild, where it is reported by a user. However, being experienced with the way management works, and knowing that not all managers listen to common sense (or they think that developers shouldn't write bugs in the first place), I've been trying to think how I will attempt to maintain my code quality in such an environment.

I'm not sure how well known the concept of teddy-bearing has become, but the urban myth starts with a development team that had a team lead who was, as a developer, pretty useless. The developers, however, found he was great for one thing. If you were stuck on a programming problem, you could call him over and ask him to have a look. Because he was so useless, the developers would have to step him through the bits of the code that were causing the problem. In the process of explaining the problem at a level the team lead could understand, the developer would quite often see his or her mistake and be able to fix it right there in front of the team lead, which made the team lead feel immensely important. When this team lead eventually moved on to bigger, better and greater things, the team was at a loss to know how they would fix these hard-to-find bugs, until one day one of the developers who was stumped on a problem picked up a teddy bear that was lying around the office, sat it on his desk, and started explaining the problem to it. They found that the teddy bear was every bit as effective as the team lead, and had the added advantage of better people skills. The teddy bear became an office legend and was invited to all the office parties, where legend has it he told far better jokes than the old team lead.

From this the concept of teddy-bearing was born; it is used whenever you encounter the phenomenon of explaining your code to someone in order to find a problem, and finding the problem yourself with no real constructive input from the person you asked to help.

So what does that have to do with code reviews? Well, I'm a big fan of the whole subversive style of process improvement. If management won't buy source control, install Subversion and maintain your own repository; if the team doesn't believe in unit testing, download NUnit and write your own; if there is no nightly build process, download NAnt and do it yourself. So if there are no code reviews, buy yourself a teddy bear, give it a name, and sit it on your desk. Fellow colleagues will probably think you are a little unusual, but you're a geek; people generally think you're more than a little unusual anyway.

When you get to the stage where you are completing a chunk of work, sit the teddy bear beside you and talk it through your code. A lot of the psychology of a code review will be there, and in reality you will have one of the toughest reviewers you can possibly have... yourself.

Wednesday, March 01, 2006

QSR Releases NVivo7

The company I have been consulting to for the past year and a bit (QSR) have today released the product we have been working on: their flagship product, NVivo 7.

I'd also like to add that it was released on time, and without stressing out the entire team for months leading up to the release. Just goes to show what good project management and appropriate resource allocation can do.

Friday, February 24, 2006

Testing for developers from the Braidy Tester

A blog that I read from time to time has some great resources for testers. The blog is called The Braidy Tester. You may ask why I, a developer, am reading such a blog. Well, I am interested in the entire SDLC, and always like to know what's going on when I throw the next build over the fence to the testers. Also, I don't think it hurts developers to be aware of how to be better testers themselves when testing their own code changes, a point The Braidy Tester appears to agree with, given a recent article on Testing For Developers.

Thursday, January 26, 2006

The spin detection machine

Interesting concept: a piece of software that detects the amount of spin in a speech. The model is based on analysing the frequency of various kinds of words in a speech, and professes to be able to detect levels of deception and insincerity in English text. I must confess, I'm more than a little skeptical. I just think there are so many variables: the speaker's/speech writer's proficiency with the English language, different styles of speech writing, different topics, different audiences. All of these things change the way we use language and shape what we say.

I guess the ultimate test will come when they decide to commercialise the software and produce a website dedicated to selling it... and then run the software over the site. I guess this could be seen as an extension of the halting problem.

Friday, January 20, 2006

Arguments for continuous integration

The company I'm currently consulting to are pretty good when it comes to their internal development systems, although one of the main areas I can see for improvement is continuous integration. They currently have a full build process that is manually initiated, and it works pretty well. I really think they need to take the next step: continuous integration.

A couple of days ago I was checking in some very subtle but serious changes to the way we access SQL Server (we switched to using User Instances so that we could run as an LUA user). We were all a bit nervous about this change because, although it was a relatively minor change, it was a significant unknown; in fact I was the only one who had any idea of how User Instances worked. So as part of the code review, the reviewer asked me to run all of the unit tests just to be sure we hadn't inadvertently broken anything, and sure enough a whole heap of them just curled up and died a horrible death. Being the insecure person that I am, I immediately rushed to the conclusion that it was my changes, and started questioning my own understanding of User Instances. When running the application, I was able to reproduce the error under certain conditions on my machine, but no-one else was able to reproduce the error on their machines.

The next day I spent the best part of the morning looking at the problem, and eventually discovered firstly that the reason I could reproduce it on my machine was because I was using an obscure (non-default) option in the application, and secondly that the bug occurred with or without my changes from the night before. Armed with this information I went to the team lead, who then spent the next half of the day going back through labels in our source control attempting to find the release in which this bug first occurred. He eventually found it to be due to a check-in about 4 or 5 days prior. This indicated two things: firstly, people had not been running even the affected unit tests for their check-ins; and secondly, if we had continuous integration, with unit tests run automatically at the point of check-in, we would not have wasted a day and a half of development time tracking down the origin of the bug. I told the development manager that this incident re-affirmed my faith as a unit-test evangelist.

SQL Server & file permissions

In the course of trying to track down a problem running the application I'm involved in developing at the moment, I stumbled across the following article describing what SQL Server does with file permissions when it attaches to a database file.
The article states that, for security reasons, SQL Server changes the file permissions so that if a file has been placed in an insecure folder, it is protected from being moved or deleted by someone without the appropriate privileges.
This seems to me to be a little misguided. Firstly, I would assume that it's the responsibility of the DBA to place all database files in an appropriate place, with appropriate permissions for the various users that require access. The secure-by-default paradigm is all well and good, but there is no way to turn this behaviour off. Secondly, it doesn't really achieve anything. If a DBA has been dumb enough to place a vitally important database file in a folder that anyone has full control over, then while SQL Server is not attached, a user can easily cut and paste the folder wherever they want, which has the same effect as moving the original file.

Maybe I'm missing something here, but I can't really see the need for this.

Saturday, January 14, 2006

New toy

Well, I'm now the proud new owner of an O2 XDA Atom.

I have been hanging out for one since my XDA II started playing up, but I wanted to hold out for a Windows Mobile 5.0 device.

So far I have mostly been impressed. I love the ability to switch between landscape and portrait screen modes, I love the built-in Wi-Fi, I adore the tiny size (in comparison to the XDA II), and it seems there have been some improvements to the predictive text which make it easier to enter data. But there are a few things I haven't been so impressed with.

Maybe it's just buggy early release software, but I have had three occasions so far where the phone has stopped responding to the on/off button. I have also not had any luck whatsoever with the device's alarm. I have had almost a week now where it has not woken me up, even though I have set it. My partner says I should read the manual, but I'm sorry, it goes against every rule of usability and good design if I have to read a manual in order to set the freakin' alarm clock. I'm happy to read the manual for other, more complex tasks, but not to set the alarm clock.

Anyway, I will persevere and hope a firmware update is close. I'm also hoping to get back into some Pocket PC development; I'm really keen to put SQL Mobile and the new version of the .Net Compact Framework through their paces.