Saturday, December 11, 2010

Error 80020006 on WP7 InvokeScript on the WebBrowser

^&$%...

If you hit this error...

Check you've set IsScriptEnabled="True" on your <phone:WebBrowser> element

Doh! Another hour of my life gone!
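
For the record, the fix in XAML looks like this (the x:Name and the script function name are made up for illustration):

```xml
<!-- script is disabled by default on the WP7 WebBrowser control -->
<phone:WebBrowser x:Name="theBrowser" IsScriptEnabled="True" />
```

With that set, a call like theBrowser.InvokeScript("myJsFunction") should stop throwing 80020006.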

Thursday, December 09, 2010

Switching to Azure SDK 1.3

I've had "quite a few" issues switching to the new SDK for Azure.

Most of these have been in big solutions where projects seem intent on continuing to reference the old 1.0.0.0 StorageClient - hopefully that's now all sorted (there's just a little "Specific Version" flag to change on each reference to stop them doing that)

One configuration change concerned diagnostics - Neil McKenzie's post helps with this http://convective.wordpress.com/2010/12/01/configuration-changes-to-windows-azure-diagnostics-in-azure-sdk-v1-3 - but really the easiest way to update your diagnostics is to use the "properties wizard" for each role within the Azure cloud project in your solution.

Overall I quite like most of the 1.3 changes - the problems I've hit as a result haven't been fun, but I understand why the changes are needed... Forwards!

Tuesday, December 07, 2010

If your Fiddler POSTs aren't correctly getting the form variables...

... make sure you've set the header:
Content-type: application/x-www-form-urlencoded
Couldn't see why my POST wasn't working - it was just a missing header - doh!
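
For reference, posting the same form data from C# code needs that header set explicitly too - a rough sketch (the URL and form values here are made up):

```csharp
using System.Net;
using System.Text;

// the Content-Type header is what tells the server to parse the body as form variables
byte[] data = Encoding.UTF8.GetBytes("name1=value1&name2=value2");
var request = (HttpWebRequest)WebRequest.Create("http://example.com/post");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = data.Length;
using (var stream = request.GetRequestStream())
{
    stream.Write(data, 0, data.Length);
}
```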

Wednesday, December 01, 2010

My "presentation" to the Windows Phone User Group

I did this "presentation" using Notepad - it was all a bit last minute....

The theme of it was about how I've just built my first WP7 app - RunSat - and the lessons learned

I think (hope?) the presentation was better when it had a demo attached!




Timeline
- last November Samsung gave me an Omnia II.
- then I bought an iPhone
- then Samsung gave me a Bada phone
- 12 days ago I wrote my first Silverlight/WPF app
- then last Thursday Paul Foster lent me a WP7 phone







1. WP7 development is a pleasure compared to:
- PC dev, 
- WinCE (MFC, then CF2) on Wm6.5, 
- Monotouch, 
- Obj-C xCode, 
- Bada C++
- a teeny bit Eclipse/Android






2. Installation of the tools was super quick 
Zune for PC installation was a bit slow...










3. Device-based dev is super fast, reliable and painless  
- debugging support is great 
- really miss wifi debugging from MonoTouch








4. Emulator-based dev is a bit slower! 
And lack of GPS in emulator is a pain
Some workarounds available - e.g. reactive










5. Silverlight/XAML 
as a non-user 10 days ago, I've found this very easy to pick up 
- but I'm a world away from mastering XAML.
- efficiency doesn't seem to matter ?

- Blend makes blood come out of my ears





6. Tombstoning is actually pretty easy

- unless your data doesn't really fit 

- mine didn't









7. When you Google (Bing?) around for sample code, it's definitely a bit confusing working out what WPF/SL3/SL4 code/xaml will work on WP7

- but there is lots of good blogging available :)








8. Bing maps integration was super quick and super nice :)

- didn't support MVVM perfectly









9. Fighting the Appstore guidelines - I hate the app stores

- app store approval gotchas...
--- pivot control abuse?
--- GPS - request access yourself and supply privacy
--- run idle - request access yourself
--- help and support
--- dark/light theming



10. Cross platform 
- this app came from WM6.5
- it went to Monotouch
- it went from Monotouch to WP7
- next up is MonoDroid (preview 7 already)







Details of cross platform

- GPS integration - just affected one module
- File IO - main IO just worked
- BUT open/close/exists/delete code needed changing for IsolatedStorage
- DateTime (clock) source information
- XML serialisation - XmlDocument isn't available on WP7 - had to port code across to XDocument - not a biggie - also used JSON.Net
- Web services calls - legacy ASMX didn't work... had to hand craft my XML :( - Would probably have been quicker to change the server APIs to newer WCF services.
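
To illustrate the IsolatedStorage point - a rough sketch of the exists/delete/create calls that replaced the System.IO.File ones (the file name is made up; this uses the WP7/Silverlight IsolatedStorageFile API):

```csharp
using System.IO.IsolatedStorage;

using (var store = IsolatedStorageFile.GetUserStoreForApplication())
{
    // File.Exists / File.Delete equivalents
    if (store.FileExists("runs.xml"))
    {
        store.DeleteFile("runs.xml");
    }

    // File.Create equivalent - the returned stream is a normal Stream
    using (var stream = store.CreateFile("runs.xml"))
    {
        // write as usual
    }
}
```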









Cross platform

- big story is how much I haven't rewritten...

- UI - lots of new code to write

---- tables/lists
---- WP7 Pivot/Panorama controls against tables and paged scrollviews
---- notifications

- MVVM pattern made this very manageable (UIDialog on MonoTouch also made my life much simpler) 
- will be much easier again in 2 days time?

- app lifecycle management - still working this out - tombstoning is quite straight forward but requires thought...

- tests... nunit runs on all platforms
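
As a footnote on the tombstoning point - at its core it's just a state dictionary; a minimal sketch (the key name and view-model methods are invented for illustration):

```csharp
// In App.xaml.cs - save transient state when the app is tombstoned...
private void Application_Deactivated(object sender, DeactivatedEventArgs e)
{
    PhoneApplicationService.Current.State["CurrentRun"] = myViewModel.SaveToString();
}

// ...and restore it if the app comes back
private void Application_Activated(object sender, ActivatedEventArgs e)
{
    object saved;
    if (PhoneApplicationService.Current.State.TryGetValue("CurrentRun", out saved))
    {
        myViewModel.LoadFromString((string)saved);
    }
}
```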

Tuesday, November 30, 2010

Marketplace advice for Windows Phone 7 (WP7)

This post is priceless - http://blogs.msdn.com/b/wsteele/archive/2010/10/15/important-info-for-wp7-application-developers.aspx

1) READ the docs!!!

2) Know your iconography.

3) Support Information – Test Case 5.6.

4) Toast Notification – Test Case 6.2

5) Applications Running Under a Locked Screen – Test Case 6.3

6) Back Button – Test Case 5.2.4
7) Themes – Test Case 5.1.1
8) Languages.
9) Failures upon Upload to the Marketplace
10) Windows Phone Developer Tools.

    Sunday, November 28, 2010

    Razor MVC3, a file upload form

    @using (Html.BeginForm("CreateStepUpload", "Pivot", FormMethod.Post, new { enctype = "multipart/form-data" }))
    {
        @Html.ValidationSummary(true)
        <fieldset>
            <legend>Fields</legend>

            <div class="editor-label">
                File:
            </div>
            <div class="editor-field">
                <input type="file" name="theFile" id="theFile" />
            </div>
            <p>
                <input type="submit" value="Save" />
            </p>
        </fieldset>
    }

    Razor, MVC3 - a readonly textbox

    Code is:

    @Html.TextBoxFor(model => model.Fields[i].TheField, new { @readonly = true })

    Saturday, November 27, 2010

    Ssssshhhh - don't tell anyone but I'm enjoying web dev with ASP.Net MVC3 and Razor

    Just very impressed at the moment that my ViewModels are flying in/out of my Controllers with quite a lot of ease...

    Also impressed with how clean Razor feels compared to old ASP.Net

    Also impressed with how easy it is to extend the Expression<> based HtmlHelper code.

    ...

    A couple of things I've done that show how much techie fun I'm having:
    1. Enabling email verification was easy -  mainly thanks to Kevin at http://thekevincode.com/2010/09/adding-email-confirmation-to-asp-net-mvc/ 

    2. Creating enum based drop down select boxes was easy - just used http://blogs.msdn.com/b/stuartleeks/archive/2010/05/21/asp-net-mvc-creating-a-dropdownlist-helper-for-enums.aspx - then in Razor:

    @using MyExtensionsNamespace

    Some HTML

    @for (var i = 0; i < Model.MyProperty.Count; i++)
    {
    <div>
    @Html.EnumDropDownListFor(model => model.MyProperty[i].MyEnumField)
    </div>
    }

    3. Using list collections was easy - see the snippet above - it just works when you then collect the ViewModel back in to the Controller Action method too :)

    4. Adding my own Model error was easy with ModelState.AddModelError 

    Thursday, November 25, 2010

    Getting Windows Phone 7 development working

    So I managed to borrow a Windows Phone 7 for some dev work :)
     
    I already had the basic tools installed and had been using the emulator.

    But to get the phone dev working...
    - you need to install the October 2010 update to the tools
    - you need to download and install the Zune PC software (60MB download and requires a reboot!)
    - you need to log on inside the Zune PC software
    - you need to use the "Windows Phone Developer Registration tool" to unlock your phone

    Some more help here:

    The good news is... this all worked :) Took about 2 hours to get going, most of that spent downloading Zune stuff!

    Wednesday, November 24, 2010

    Razor outputting raw unescaped HTML.. ASP.Net MVC3

    In general I love the look and feel of the new Razor View Engine - works really well.

    However if you ever want to output raw HTML then it's a bit ugly - e.g. to output some JSON I did:

    var currentConnectionStrings = @(new HtmlString(Json.Encode(Model.ConnectionStrings)));

    It would be nice if this instead did something like @=Json.Encode(Model.ConnectionStrings) - or some other "special" character combination (MVC3's @Html.Raw(Json.Encode(Model.ConnectionStrings)) gets close)

    JQuery - responding to selected tab using tabselect

    If you have a standard jquery UI tabs setup:

            <div id="tabs-project">
                <ul>
                    <li><a href="#tabs-project-core">Details</a></li>
                    <li><a href="#tabs-project-connections">Connections</a></li>
                    <li><a href="#tabs-project-alerts">Alerts</a></li>
                </ul>
                <div id="tabs-project-core">
                    etc...
                </div>
            </div>

    And you want to determine which tab has been pressed by name (not index - because the index might change when code gets moved or if tabs get inserted...) then you can use:
     
            function getIndexOfTabAnchor(name) {
                return $("a[href*=#" + name + "]").parent().index();
            } 
            $("#tabs-project").tabs({
                select: function (event, ui) {
                    if (ui.index == getIndexOfTabAnchor('tabs-project-jobs')) {
                        // do jobs stuff...
                    }
                    else if (ui.index == getIndexOfTabAnchor('tabs-project-schedules')) {
                        // do schedules stuff...
                    }
                }
            });

    Note that getIndexOfTabAnchor() uses http://api.jquery.com/attribute-contains-selector/

    Awesome comparison of the costs and benefits of Azure versus AWS

    I've just stumbled upon this awesome analysis of some Azure versus AWS services.

    I think there are several strategic and architectural things to consider when choosing between the two services - choosing to deploy roles on Azure is an application lifestyle choice compared to choosing OS deployment and system admin on EC2 - and both lifestyles have their problems!

    There's lots of good material in here:
    http://compositecode.com/2010/11/01/cloudthrowdown-part1/

    Monday, November 22, 2010

    PivotViewer, shared images and memory problems

    I've spent a large chunk of the last weekend playing with the Silverlight PivotViewer from Microsoft.


    It's lovely!

    Despite the official upper limit on number of items being "3000", I've taken it up as high as 73000 items now - the animations don't work at that level, but the pivotviewer is very very useful there!

    You can see some screenshots (attached hopefully!) and there's also one "live" demo on http://zoom.runsaturday.com

    The only downside - there have been a few issues I've had - one of the key ones of which is:
    • if you create a .cxml collection in which items don't each have unique images (e.g. they share a default image) then the Silverlight control does something odd - it somehow seems to try to duplicate the shared images and uses up lots of CPU and memory as a result.
    This bug is reported in a bit more detail on this thread - http://forums.silverlight.net/forums/t/210211.aspx:

    To recap:

    • the bug is for collections where <Item>s share img references.
    • if I load up one of these collections in the standalone viewer these work fine
      - the load is smooth and the memory use is expected.
    • if I load up the same collection in the silverlight pivotviewer then the items with shared images
      take a long time to display (they pop into the display one by one) and the memory use is higher
      - it increases with each image loaded.

    You can just about see this effect if you watch the swimmer icons when you first load up:

    The effect is really really noticeable when you get to larger data sets

    Sunday, November 21, 2010

    Numpty Silverlight DataBinding continues....

    So if you are doing Silverlight or WPF and you're trying to do MVVM... and you sit there wondering why your ObservableCollection<T> just isn't working...

    ... then consider it might just be....

    ... because you are an idiot and you declared your code:

    public ObservableCollection<T> MyCollection;

    when it should of course have been...

    public ObservableCollection<T> MyCollection { get; set; }

    Arggggggggggggggh! Serves me right for coding on Sunday night! 

    Some hacky fixes to PAuthor PivotViewer library

    Overall the PAuthor project for creating PivotViewer for Silverlight datasets works great - I've used it in .Net4 for 4 separate command line apps :)

    However, when working on some large data sets (around 75000 Items) I found the following problems - mostly to do with corrupt items in my input.

    Hope this post helps other people...

    Problems seen in collectionCreator.Create(m_dziPaths, m_dzcPath); in ParallelDeepZoomCreator.cs

    • System.ArgumentException in ParallelDeepZoomCreator.cs due to problems in my item names (some of my names included the "*" wildcard character - doh!)
    • And System.IO.FileNotFoundException in ParallelDeepZoomCreator.cs due to unknown problem (no real clue why these particular folders didn't exist - it was 8 out of 75000...)

    To fix these I just wrapped the call with: 

                List<String> toRemove = new List<string>();
                foreach (var c in m_dziPaths)
                {
                    try
                    {
                        System.Security.Permissions.FileIOPermission f = new System.Security.Permissions.FileIOPermission(
                            System.Security.Permissions.FileIOPermissionAccess.AllAccess, c);
                    }
                    catch (ArgumentException)
                    {
                        toRemove.Add(c);
                        System.Diagnostics.Trace.WriteLine("INVALID PATH " + c);
                    }
                }
                foreach (var c in toRemove)
                {
                    m_dziPaths.Remove(c);
                }

                while (true)
                {
                    try
                    {
                        DZ.CollectionCreator collectionCreator = new DZ.CollectionCreator();
                        collectionCreator.Create(m_dziPaths, m_dzcPath);
                        break;
                    }
                    catch (System.IO.FileNotFoundException exc)
                    {
                        System.Diagnostics.Trace.WriteLine("STUART - SORRY - REMOVING " + exc.FileName);
                        m_dziPaths.Remove(exc.FileName);
                    }
                }


    Some multithreaded problem seen in image download

    The finalizer for PivotImage.cs occasionally sees IO exceptions in the File.Delete operation - the exception claims that the file is currently open in another process.

    Not sure what is causing this - my guess is it's a multithreading issue of some description.

    To fix (mask) this I simply added a try catch to the finalizer:


            ~PivotImage()
            {
                if (m_shouldDelete == false) return;
                if (File.Exists(m_sourcePath) == false) return;

                try
                {
                    File.Delete(m_sourcePath);
                }
                catch (IOException)
                {
                    System.Diagnostics.Trace.WriteLine("Some clean up needed " + m_sourcePath);
                }
            }

    That's it - hope it helps someone.

    Saturday, November 20, 2010

    Using the Silverlight PivotViewer Control and seeing NotFound when deployed on IIS

    If you build and test your lovely new PivotViewer SL app and then when
    you deploy to a full IIS server you see "NotFound" reported for your
    collection, then...

    Make sure you've set the mimetype for .cxml, .dzi, .dzc all as
    text/xml within IIS for your current website.

    Worked for me :)

    Helpful suggestion found on :
    http://forums.silverlight.net/forums/p/194861/452655.aspx
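
    If you'd rather not click through the IIS UI, I believe the same fix can go in web.config on IIS7:

```xml
<system.webServer>
  <staticContent>
    <mimeMap fileExtension=".cxml" mimeType="text/xml" />
    <mimeMap fileExtension=".dzi" mimeType="text/xml" />
    <mimeMap fileExtension=".dzc" mimeType="text/xml" />
  </staticContent>
</system.webServer>
```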

    Friday, November 19, 2010

    Silverlight compressed binary serialization

    So... I did my first Silverlight app - very good fun (although there was a fair amount of head scratching too!)

    As part of this app I needed to send some fairly chunky bits of data to the client. To do this I used http://whydoidoit.com/silverlight-serializer/ and http://slsharpziplib.codeplex.com/

    On the server (in an offline process) the code that generated my compressed files looked like:

                using (var f = File.Create(zipFileName))
                {
                    using (var zipOut = new ZipOutputStream(f))
                    {
                        ZipEntry entry = new ZipEntry("stuff");
                        var toStore = Serialization.SilverlightSerializer.Serialize(myClass);
                        entry.DateTime = DateTime.Now;
                        entry.Size = toStore.Length;
                        zipOut.PutNextEntry(entry);
                        zipOut.Write(toStore, 0, toStore.Length);
                        zipOut.Finish();
                        zipOut.Close();
                    }
                }

    And in the Silverlight client the code looked like:

            public void LoadFrom(Uri url)
            {
                HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
                request.BeginGetResponse(new AsyncCallback(ReadCallback), request);
            }

            private void ReadCallback(IAsyncResult asynchronousResult)
            {
                HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
                HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(asynchronousResult);
                using (ZipInputStream zipIn = new ZipInputStream(response.GetResponseStream()))
                {
                    ZipEntry entry;
                    if ((entry = zipIn.GetNextEntry()) != null)
                    {
                        var myClass = Serialization.SilverlightSerializer.Deserialize<TheClass>(zipIn);
                        if (myClass == null)
                        {
                            MessageBox.Show("Some error has occurred - unable to read data - sorry!");
                            return;
                        }

                        Dispatcher.BeginInvoke(() =>
                            {
                               // do stuff with myClass here
                            });
                    }
                }
            }

    It all worked surprisingly well :)

    Thursday, November 18, 2010

    More problems with SQL Azure backup


    You can back up SQL Azure using:
    CREATE DATABASE [newName] COPY OF [oldName] *BUG HERE - see the newsflash below*

    However, recently I've hit lots of problems doing this - it fails with "The CREATE DATABASE statement must be the only statement in the batch."

    Sadly I don't seem to be able to get past this... I think it's something to do with names? Certainly it's an odd (and probably misleading) message... Anyways... I'll use the other backup method instead...

    NEWSFLASH!!!!
    The problem was my typo...

    The statement is not:
    CREATE DATABASE [newName] COPY OF [oldName]

    Instead it is:
    CREATE DATABASE [newName] AS COPY OF [oldName]

    Doh! (although top marks to SQL Azure for the error message)

    Tuesday, November 16, 2010

    A look at some Silverlight control libraries

    As I start serious Silverlight development for the first time, I've been looking at and evaluating some of the control libraries out there.

    Why? Well, because they contain huge amounts of slick, useful, QA'd functionality which will accelerate my development - so they represent superb *value* for money.

    Here are some of my notes so far:
    • Telerik - have a huge set of controls including rich text editing, super framework libraries (like containers and transitions), and lots and lots of charting. I've attended a couple of their webinars and clearly they've got good functionality now, plus lots more coming every quarter.
    • Infragistics - a fairly large set of functionality again including grids, menus and a docking framework
    • Dundas - seems to be mainly focussed on Business Intelligence (BI) dashboards - so lots of charts.
    • Actipro - a much smaller set of functionality focussed around an MsDev like code editor.
    So far... I'm tempted to buy Telerik with the intention of using:
    - the charting - especially the lovely sparklines :)
    - the grid - everyone needs one eventually!
    - the overall framework stuff - especially as they use MVVM everywhere.

    I will install the trial version soon and then come back with more feedback.

    Tuesday, November 02, 2010

    Serialising Enums with Newtonsoft JSON.Net

    For a while... I was using this serializer:

    public class JsonEnumConverter<T> : JsonConverter
    {
        public override bool CanConvert(Type objectType)
        {
            return objectType == typeof(T);
        }

        public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        {
            writer.WriteValue(((T)value).ToString());
        }

        public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
        {
            return Enum.Parse(typeof(T), reader.Value.ToString());
        }
    }

    This allowed me to serialize/deserialize using attributes like:

    [JsonObject]
    public class MyClass
    {
        [JsonConverter(typeof(SerializationUtils.JsonEnumConverter<DateTimeTestFormat>))]
        public DateTimeTestFormat DateTimeTestFormat { get; set; }
    }

    However, eventually I worked out that I could just use a more general
    solution - and that I could pass that in to the top-level JsonConvert
    methods:

    public class GeneralJsonEnumConverter : JsonConverter
    {
        public override bool CanConvert(Type objectType)
        {
            return objectType.IsEnum;
        }

        public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        {
            writer.WriteValue(value.ToString());
        }

        public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
        {
            return Enum.Parse(objectType, reader.Value.ToString());
        }
    }

    Much easier... :)

    Monday, November 01, 2010

    Some practical experience of GPS battery life for Sports Tracking apps

    On Navmi/RunSat, I got asked:

    >> What's the impact on the battery when doing frequent GPS resolutions?

    The answer is battery life varies dependent on phone...

    On an iPhone you have to really mess about to improve battery life -
    the only way of doing it on older phones is to start playing out audio
    (e.g. a playlist) - then you can turn the screen off but still keep
    your app alive and capturing GPS.

    On my WM6.5 Omnia II, it's easy to keep the app/GPS alive but to save
    some power. If I do that and still monitor the GPS, then up to 6 hours
    battery life is OK - so pretty good.

    On a Bada Wave, there's a SYSTEM_SERVICE level call to make so that
    the app can carry on even when the screen is off - locked, I can
    manage 3-4 hours easily - not tried any longer yet - but suspect 6 is
    about the current limit.

    On my Nokia 5800, it seems that the Nokia ST app managed 2-3 hours OK
    - not tried it any longer.

    On Android, it's too early for me to say - but I know that the Google
    MyTracks app can manage long sessions - 6 hours+ - although they do
    have some sample-rate dropping code to do that - basically it depends
    what your activity is:
    - if you're walking, then you can probably get away with sampling only
    every few minutes
    - but if you ever want to do turn-by-turn directions in a car, then
    you'll need to up the sampling rate to something much more frequent.

    Friday, October 29, 2010

    PDC 2010 Updates - First Look

    The main updates from the PDC weren't that exciting... I haven't seen anything on Azure storage yet - hopefully soon...

    The coolest thing I saw was Anders talking about C# 5 - async and await - think LINQ for async tasks - see http://blogs.msdn.com/b/ericlippert/archive/2010/10/28/asynchrony-in-c-5-part-one.aspx and download the CTP from http://msdn.microsoft.com/en-gb/vstudio/async.aspx - I liked :) Full video available on the pdcplayer... probably too much effort to find it!

    The main things that might impact some of the projects that I'm working on were:
    - a much better Azure management portal is on the way - at last!
    - full IIS7 hosting - so you can set custom properties and so your web role can host multiple web projects :)
    - Remote Desktop access to your web and worker role instances (not sure it's that useful to my projects but it made me feel happier)
    - Velocity (Memcached) will be available soon
    - Java will become a "first class citizen" ?!
    - Extra small instances - I suspect that most of my worker roles could move to these

    Azure overview is here:
    SQL Azure updates are here:

    Will keep an eye on some of the sessions tonight - especially Jai's deep dive into storage.

    Stuart

    Wednesday, October 27, 2010

    How to do "TOP N" in nhibernate's query language

    How to do "TOP N" in nhibernate's query language...

    Seems like you can't - so instead you have to use:
    - SetMaxResults(N)
    on the IQuery object.

    With help from:
    http://stackoverflow.com/questions/555045/nhibernate-hqls-equivalent-to-t-sqls-top-keyword
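
    A quick sketch of what that looks like (the entity name and HQL here are invented; "session" is an open NHibernate ISession):

```csharp
// "TOP 10" via IQuery.SetMaxResults - NHibernate adds the limit clause for you
IQuery query = session.CreateQuery("from Thing t order by t.CreatedOn desc");
query.SetMaxResults(10);
IList<Thing> topTen = query.List<Thing>();
```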

    Tuesday, October 26, 2010

    The 5 second guide to adding WCF Client code

    In case you ever have to blag some WCF client code ever...

    1. Right click on your project
    2. Choose "Add Service Reference"
    3. In the dialog, enter the service URL, wait for the "discovery" to happen, then change the namespace name and hit OK
    4. In the client code just call the service - like
                    var client = new MyNameSpace.MyServiceClient();
                    int result = client.Method(param1, param2, param3);

    Simples ;)

    Tuesday, October 19, 2010

    Converting datetimes to just the date (and to pretty text) in SQLServer

    From http://stackoverflow.com/questions/113045/how-to-return-the-date-part-only-from-a-sql-server-datetime-datatype

    DATEADD(dd, 0, DATEDIFF(dd, 0, [Datetime])) AS TheDate,

    Also, this helps for text formatting

    CONVERT(nvarchar, TheDate, NNN)
    101 - 10/19/2010
    102 - 2010.10.19
    103 - 19/10/2010
    104 - 19.10.2010
    105 - 19-10-2010
    106 - 19 Oct 2010
    107 - Oct 19, 2010
    108 - 00:00:00
    109 - Oct 19 2010 12:00:00:000AM
    110 - 10-19-2010
    111 - 2010/10/19
    112 - 20101019
    113 - 19 Oct 2010 00:00:00:000

    Monday, October 18, 2010

    Parsing DateTime's into UTC

    This article proved helpful - http://www.mindthe.net/devices/2008/05/23/tryparseexact-and-utc-datetime/

    But actually I think the answer is to use BOTH AssumeUniversal AND AdjustToUniversal
                DateTime.TryParseExact(group.Value, this.customParseText, CultureInfo.InvariantCulture, System.Globalization.DateTimeStyles.AssumeUniversal | System.Globalization.DateTimeStyles.AdjustToUniversal, out dateTime);
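
    A small worked example of the two flags together (the input and format strings are illustrative) - AssumeUniversal says "treat the input as UTC", AdjustToUniversal says "keep the result in UTC rather than converting to local time":

```csharp
using System.Globalization;

DateTime parsed;
bool ok = DateTime.TryParseExact(
    "2010-10-18 09:30:00",
    "yyyy-MM-dd HH:mm:ss",
    CultureInfo.InvariantCulture,
    DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal,
    out parsed);
// on success, parsed.Kind is DateTimeKind.Utc
```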

    Friday, October 15, 2010

    Datetime comparison for Azure Table Storage REST API


    string.Format("{0} le datetime'{1}'", PropertyName, XmlConvert.ToString(cutoffDate, XmlDateTimeSerializationMode.RoundtripKind))

    although obviously this will then need to be UrlEncoded
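
    Putting the two steps together - build the filter, then encode it for the query string (the property name and cutoff date are made up):

```csharp
using System.Xml;

DateTime cutoffDate = new DateTime(2010, 10, 15, 0, 0, 0, DateTimeKind.Utc);
string filter = string.Format("{0} le datetime'{1}'",
    "Timestamp",
    XmlConvert.ToString(cutoffDate, XmlDateTimeSerializationMode.RoundtripKind));
string encoded = Uri.EscapeDataString(filter);
// then append as ...?$filter= + encoded
```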

    Wednesday, October 13, 2010

    Get a RelativePath string

            public string GetRelativePath(string rootPath, string fullPath)
            {
                Uri uri1 = new Uri(rootPath);
                Uri uri2 = new Uri(fullPath);

                Uri relativeUri = uri1.MakeRelativeUri(uri2);

                return relativeUri.ToString();
            }

    Test if a path is a Directory?

            public bool IsDirectory(string path)
            {
                try
                {
                    FileAttributes attr = System.IO.File.GetAttributes(path);
                    if ((attr & FileAttributes.Directory) == FileAttributes.Directory)
                    {
                        return true;
                    }
                    return false;
                }
                catch (FileNotFoundException)
                {
                    return false;
                }
                catch (DirectoryNotFoundException)
                {
                    return false;
                }
            }

    Monday, October 11, 2010

    If you see Error 32 Internal Compiler Error: stage 'COMPILE' ....

    If you see Error 32 Internal Compiler Error: stage 'COMPILE' ....

    then what caused it for me...

    was accessing a {get; private set;} auto-property

    Removing that allowed the compile to proceed again

    If you're looking for quick and effective free training on M$oft tools

    This resource looks excellent:
    http://channel9.msdn.com/Learn/Courses/

    Then you can also play with code from:

    There's so much resource out there!

    Sunday, October 10, 2010

    If you are using AutoFac and you see... "are you missing a using directive or an assembly reference?"

    e.g if you see:

    'Autofac.ContainerBuilder' does not contain a definition for 'RegisterAssemblyTypes' and no extension method 'RegisterAssemblyTypes' accepting a first argument of type 'Autofac.ContainerBuilder' could be found (are you missing a using directive or an assembly reference?)

    or 
    'Autofac.ContainerBuilder' does not contain a definition for 'RegisterInterface' and no extension method 'RegisterInterface' accepting a first argument of type 'Autofac.ContainerBuilder' could be found (are you missing a using directive or an assembly reference?)

    or....

    then check that you have "using Autofac;" at the top of the file

    if you don't include this then you won't get the necessary extension methods inside your file's visible namespaces.
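
    A minimal sketch of a file that compiles happily (the registered assembly/type are invented for illustration):

```csharp
using Autofac; // <-- this line is what brings the extension methods into scope

var builder = new ContainerBuilder();
builder.RegisterAssemblyTypes(typeof(MyService).Assembly)
       .AsImplementedInterfaces();
IContainer container = builder.Build();
```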

    Friday, October 01, 2010

    Back to basics - simple Azure table test results part 2

    Another test - what happens if you try concurrent updates.... breaking the optimistic concurrency locking

            static TableServiceContext GetContext()
            {
                CloudTableClient client = new CloudTableClient(
                    "https://***.table.core.windows.net",
                    new StorageCredentialsAccountAndKey("***", "****"));

                //client.CreateTableIfNotExist("TestData");
                var context1 = client.GetDataServiceContext();

                return context1;
            }

            static Thing GetFredFromContext(TableServiceContext context)
            {
                var query = context.CreateQuery<Thing>("TestData");
                var subquery = from t in query
                               where t.PartitionKey == "Fred"
                               && t.RowKey == "Bloggs"
                               select t;

                var element = subquery.First();
                return element;
            }

            static void Main(string[] args)
            {
                var context1 = GetContext();
                var context2 = GetContext();

                var one = GetFredFromContext(context1);
                var two = GetFredFromContext(context2);

                one.TestField = "test2";
                two.TestField = "test3";

                context1.UpdateObject(one);
                context2.UpdateObject(two);

                context1.SaveChanges();

                try
                {
                    context2.SaveChanges();
                }
                catch (Exception e)
                {
                    // exception is caught here
                }
            }


    The exception is System.Data.Services.Client.DataServiceRequestException, and the InnerException is a System.Data.Services.Client.DataServiceClientException with a StatusCode of 412. From http://msdn.microsoft.com/en-us/library/dd179438.aspx that corresponds to:

    UpdateConditionNotSatisfied - Precondition Failed (412)
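    A (hypothetical) way to recover from the 412: catch the exception, re-fetch the entity on a fresh context so you pick up the new ETag, reapply the change, and save again. The UpdateWithRetry name and retry loop below are my own sketch - GetContext() and GetFredFromContext() are the helpers from the test code above.

```csharp
// Sketch only: retry an update after an optimistic-concurrency conflict (412).
static void UpdateWithRetry(string newValue, int maxAttempts)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        var context = GetContext();               // fresh context => fresh ETag
        var entity = GetFredFromContext(context); // re-read the current row
        entity.TestField = newValue;
        context.UpdateObject(entity);
        try
        {
            context.SaveChanges();
            return;                               // saved successfully
        }
        catch (System.Data.Services.Client.DataServiceRequestException e)
        {
            var inner = e.InnerException
                as System.Data.Services.Client.DataServiceClientException;
            if (inner == null || inner.StatusCode != 412)
                throw;                            // not a concurrency conflict
            // 412 again - someone else got in first; loop and retry
        }
    }
    throw new InvalidOperationException("Update failed after " + maxAttempts + " attempts");
}
```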

    Back to basics - simple Azure table test results

    What happens if you try to create an object twice:

                CloudTableClient client = new CloudTableClient(
                    "https://thing.table.core.windows.net",
                    new StorageCredentialsAccountAndKey("thing", "****blurb******"));

                client.CreateTableIfNotExist("TestData");
                var context1 = client.GetDataServiceContext();

                Thing t = new Thing();
                t.PartitionKey = "Fred";
                t.RowKey = "Bloggs";
                t.TestField = "test";

                context1.AddObject("TestData", t);
                var response = context1.SaveChanges();

                context1.Detach(t);

                context1.AddObject("TestData", t);
                response = context1.SaveChanges();

    The answer is that an exception is thrown in the second "SaveChanges()" - a System.Data.Services.Client.DataServiceRequestException, with an InnerException of type DataServiceClientException with StatusCode 409 - EntityAlreadyExists.
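    If what you actually want is "insert, or update if it already exists", one possible pattern (my sketch, reusing the Thing class and "TestData" table from above) is to catch the 409 and fall back to an unconditional update by attaching with a "*" ETag:

```csharp
// Sketch only: insert-or-update by catching EntityAlreadyExists (409).
static void InsertOrUpdate(TableServiceContext context, Thing t)
{
    context.AddObject("TestData", t);
    try
    {
        context.SaveChanges();
    }
    catch (System.Data.Services.Client.DataServiceRequestException e)
    {
        var inner = e.InnerException
            as System.Data.Services.Client.DataServiceClientException;
        if (inner == null || inner.StatusCode != 409)
            throw;                                // some other failure
        context.Detach(t);
        context.AttachTo("TestData", t, "*");     // "*" ETag => update regardless of version
        context.UpdateObject(t);
        context.SaveChanges();
    }
}
```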

    Thursday, September 30, 2010

    Azure Storage Performance

    From a presentation by Brad Calder - Microsoft PDC 2009 


    Per Object/Partition Performance at Commercial Availability

    Throughput
    - Single Queue and Single Table Partition: up to 500 transactions per second
    - Single Blob: small reads/writes up to 30 MB/s; large reads/writes up to 60 MB/s

    Latency
    - Typically around 100ms (for small transactions)
    - Can increase to a few seconds during heavy spikes while load balancing kicks in

    Tuesday, September 21, 2010

    A really good explanation of SQL Server and indexes

    It didn't answer my current question, but it was good to read this again:
    http://www.sqlteam.com/article/sql-server-indexes-the-basics

    For reference: my current question was how to optimise a query which does a > comparison between two columns. I can see how to do it using an indexed computed column, but that's not available to me at present, as I'm using Fluent NHibernate...

    This question helped a bit "What happens when a function is used in the WHERE clause? In that case, the index will not be used because the function will have to be applied to every row in the table."
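    For what it's worth, the computed-column approach I couldn't use looks something like this (table and column names are made up for illustration):

```sql
-- Turn "WHERE ColA > ColB" into a seekable predicate by indexing
-- a persisted computed column.
ALTER TABLE dbo.Readings
    ADD AIsGreater AS CASE WHEN ColA > ColB THEN 1 ELSE 0 END PERSISTED;

CREATE INDEX IX_Readings_AIsGreater ON dbo.Readings (AIsGreater);

-- This query can now use the index instead of scanning every row:
SELECT * FROM dbo.Readings WHERE AIsGreater = 1;
```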

    Sunday, September 19, 2010

    When your SQL log (ldf) file gets Huge...

    If you're having problems with large SQL log files, then this was the most useful advice I found.

    For my purposes the answer was "no" - my data wasn't money-sensitive - so I switched to "Simple" recovery and could then shrink the database using the Tasks right-click menu in SSMS.


    Do you make transaction backups?
    Yes -> Do not shrink the log unless you have done some exceptional operation (such as a massive delete) as the "cost" of SQL Server re-growing the Log file is significant, and will lead to fragmentation and thus worse performance.
    No and I don't want to -> Change the Recovery Model to "Simple" - Enterprise Manager : Right click database : Properties @ [Options]
    Don't know -> Transaction log backups allow you to recover to a "point in time" - you restore the last Full Backup (and possibly a subsequent Differential backup) and then every log backup, in sequence, until the "point in time" that you want to roll-forward to.
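    The "No and I don't want to" route can also be done in T-SQL rather than through the Enterprise Manager / SSMS menus - a sketch, with placeholder database and log file names:

```sql
-- Switch to Simple recovery (no more transaction log backups)
ALTER DATABASE MyDb SET RECOVERY SIMPLE;

-- Then shrink the log file (the logical file name is usually <db>_log)
USE MyDb;
DBCC SHRINKFILE (MyDb_log, 1);  -- target size in MB
```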


    Saturday, September 18, 2010

    Timing tests on Azure

    In order to test the performance of your app on Azure, you really need to test on Azure.

    This is especially the case when your app needs to talk to local SQL Azure, Table, Blob, and Queue resources - you can't test the performance of these:
    - when you are running the storage on the DevFabric
    - when you are running on the real storage, but with your app on a dev machine outside of the data center

    When you are running on Azure, how do you get your results out?

    Well, to start with I've just used some simple tests using System.Diagnostics.Stopwatch.

    At the start of the method to be tested:
                List<KeyValuePair<string, long>> times = new List<KeyValuePair<string, long>>();
                times.Add(new KeyValuePair<string, long>("Original", 0L));
                System.Diagnostics.Stopwatch timer = new System.Diagnostics.Stopwatch();
                timer.Start();

    Half way through:
                times.Add(new KeyValuePair<string,long>("namedPoint", timer.ElapsedMilliseconds));

    And at the end:
                times.Add(new KeyValuePair<string, long>("timeToEnd", timer.ElapsedMilliseconds));
                timer.Stop();

                foreach (var t in times)
                {
                    System.Diagnostics.Trace.TraceError(string.Format("{0}:{1}ms", t.Key, t.Value));
                }

    This then uploads the results to Azure Diagnostics - currently Trace is routed to Table storage.

    Obviously this is only a temporary solution - it's only suited to simple methods, and you have to remove this code before Production.
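    The same idea wrapped up in a small helper class (my own sketch, nothing to do with the SDK) keeps the instrumentation in one place, which also makes it easier to strip out before Production:

```csharp
using System.Collections.Generic;

// Sketch: reusable version of the timing code above.
class SimpleTimer
{
    readonly System.Diagnostics.Stopwatch _timer =
        System.Diagnostics.Stopwatch.StartNew();
    readonly List<KeyValuePair<string, long>> _marks =
        new List<KeyValuePair<string, long>>();

    public void Mark(string name)
    {
        _marks.Add(new KeyValuePair<string, long>(name, _timer.ElapsedMilliseconds));
    }

    public void Dump()
    {
        _timer.Stop();
        foreach (var m in _marks)
            System.Diagnostics.Trace.TraceError(
                string.Format("{0}:{1}ms", m.Key, m.Value));
    }
}
```

    Usage: new SimpleTimer() at the top of the method, Mark("namedPoint") at each point of interest, and Dump() at the end.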

    Friday, September 17, 2010

    Azure Storage Tools

    A useful list from http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx

    Personally I quite like https://www.myazurestorage.com/, although I've also used http://www.cerebrata.com recently

    Explorer                         Block Blob   Page Blob   Tables   Queues   Free
    -------------------------------  -----------  ----------  -------  -------  -----
    Azure Blob Client                X                                          Y
    Azure Blob Compressor (1)        X                                          Y
    Azure Blob Explorer              X                                          Y
    Azure Storage Explorer           X                        X        X        Y
    Azure Storage Simple Viewer      X                        X        X        Y
    Cerebrata Cloud Storage Studio   X            X           X        X        Y/N
    Cloud Berry Explorer             X            X                             Y
    Clumsy Leaf Azure Explorer (2)   X            X           X        X        Y
    Factonomy Azure Utility          X                                          Y
    Gladinet Cloud Desktop           X                                          N
    MyAzureStorage.com (3)           X            X           X        X        Y
    Space Block                      X                                          Y
    Windows Azure Management Tool    X            X           X        X        Y

    (1) Azure Blob Compressor enables compressing blobs for upload and download
    (2) Clumsy Leaf Azure Explorer is a Visual Studio plug-in
    (3) MyAzureStorage.com is a portal to access blobs, tables and queues


    Azure Links

    Credit to Neil Mackenzie for this list - from http://nmackenzie.spaces.live.com/blog/cns!B863FF075995D18A!706.entry

    Azure Team

    http://blogs.msdn.com/b/azuredevsupport/ – Azure Support Team

    http://blogs.msdn.com/b/dallas/ – Microsoft Codename Dallas

    http://blogs.msdn.com/b/sqlazure/ – SQL Azure Team

    http://blogs.msdn.com/b/cloud/ - Windows Azure Developer Tools Team

    http://blogs.msdn.com/b/windowsazurestorage/ – Windows Azure Storage Team

    http://blogs.msdn.com/b/windowsazure/ – Windows Azure Team

    Individual

    http://blogs.msdn.com/b/partlycloudy/ – Adam Sampson

    http://bstineman.spaces.live.com/blog/ – Brent Stineman (@BrentCodeMonkey)

    http://blog.structuretoobig.com/ – Brian Hitney

    http://channel9.msdn.com/shows/Cloud+Cover/ – Cloud Cover (@cloudcovershow)

    http://www.davidaiken.com/ – David Aiken (@TheDavidAiken)

    http://davidchappellopinari.blogspot.com/ – David Chappell

    http://geekswithblogs.net/iupdateable/Default.aspx – Eric Nelson (@ericnel)

    http://gshahine.com/blog/ – Guy Shahine

    http://blogs.msdn.com/b/jmeier/ – J.D. Meier

    http://blogs.msdn.com/b/jnak/ – Jim Nakashima (@jnakashima)

    http://blog.maartenballiauw.be/ - Maarten Balliauw

    http://nmackenzie.spaces.live.com/blog/ – Neil Mackenzie (@mknz)

    http://azuresecurity.codeplex.com/ - Patterns & Practices: Windows Azure Security Guidance

    http://oakleafblog.blogspot.com/ – Roger Jennings (@rogerjenn)

    http://dunnry.com/blog/ – Ryan Dunn (@dunnry)

    http://scottdensmore.typepad.com/blog/ – Scott Densmore (@scottdensmore)

    http://blog.smarx.com/ - Steve Marx (@smarx)

    http://blogs.msdn.com/b/sumitm/ - Sumit Mehrotra

    http://azure.snagy.name/blog/ – Steven Nagy (@snagy)

    http://blog.toddysm.com/ - Toddy Mladenov (@toddysm)

    Azure Status

    http://www.microsoft.com/windowsazure/support/status/servicedashboard.aspx – Azure Status

    Other Clouds

    http://highscalability.com/ – High Scalability

    http://perspectives.mvdirona.com/ – James Hamilton

    http://www.allthingsdistributed.com/ – Werner Vogels