End of an Era

The end has actually been and gone; it just took me a while to come out from under my stone.

Earlier this month, I finally submitted my MSc dissertation, which dealt with barriers to entry into the Cloud Computing user market for Small and Medium Enterprises (SMEs). The results will appear here once they have been ratified by the Open University, if and when I pass. Though I wasn’t sweating over the final deadline when it came, the last week or ten days were certainly a bit nerve-wracking – especially when I realised I had the order of the document slightly wrong and had to hastily swap it round!

The final edit came in the days following my sister’s wedding, in the window of the bar at the Red Barn in Woolacombe Bay, Devon, overlooking the beach with the Isle of Lundy hazy in the distance. That most certainly helped matters, as did the staff keeping me handily supplied with coffee and cake.

Hopefully this means I can blog more regularly, with material that will prove genuinely beneficial to people. It has been a daunting quest over the last 13 months to get to this point, but one I would still recommend to anyone, whether for personal fulfilment or to improve your career prospects. In my case it was a bit of both, and it helped that I had a supportive employer who was prepared to give me financial aid as long as my studies were undertaken, for the most part, outside of normal hours.

For those considering undertaking a similar project, here are my words of advice, born out of experience!

  • Think very carefully about doing higher-level research if you anticipate any sort of major life change at the same time: marriage, a baby, a new job, moving house. I suddenly found myself in charge of my department at work halfway through the project, which meant the secondary research was hard to schedule and complete. There are examples of people doing such things whilst juggling all manner of life events, but they are the exception rather than the rule.
  • The project proposal is key – make sure it is feasible in the amount of time you have available, ensure that no-one else has done your research already, and that it is scoped properly. Thirteen months is enough time to make a modest contribution to academic debate, not to rewrite the script on a topic entirely.
  • Develop a plan and stick to a schedule – be realistic about your objectives and how long they will take. Make sure you revisit the plan regularly to ensure you are on track. If possible, try to get ahead of the plan by around two weeks, and keep two weeks’ leave in reserve at work in case you need it to get over the line at the end.
  • Have a backup plan in case your primary research method does not work out the way you expected. If your survey does not garner enough responses, or the software you were going to write turns out to already exist, have something up your sleeve to either augment or replace your original method.
  • Have electronic backups of everything! Make sure you do not get confused and overwrite what has gone before – develop a filing structure and number revisions sequentially. If you feel it is necessary, invest in some version control – I had occasion to revert to a previous version of my document, and luckily my Dropbox and Mozy accounts both provided me with this functionality.
  • Remember you can speak to anyone, literally anyone. Even industry experts. Chances are they will be more interested in what you are doing than anyone else, including the more established industry representative bodies.
  • Invest in a referencing tool. I used EndNote and was pretty impressed with its range of functionality. The time saved by picking up references directly from online academic databases and re-using them in the house style inside my Word document was immense – no index cards, no manual cross-referencing. RefWorks with Firefox is also an option if you do not want to invest in EndNote.
  • Make time for other things outside the research – for family, your real, paid work, going to the football, anything. Find a working regimen that works for you, and a space you can work in – not everyone can work in a library; sometimes it’s easier at home, or in the shed.

And finally, realise that whilst the research may be your passion, it may not be everyone else’s. Acknowledge those who have been patient with you on the journey, and do something nice for them when it is all over!

SQL Server Denali / 2012 Licensing

Microsoft are changing the way SQL Server is licensed when SQL Server 2012 (aka “Denali”, currently at the Release Candidate stage) launches, probably in early 2012.

Previously, SQL Server has always been licensed on a per-user or a per-processor basis. As processor technology has moved on to dual-core and eventually quad-core chips, Microsoft has presumably been feeling a bit peeved that customers can run far more processing power than a per-processor licence would suggest.

Looking ahead to Denali, then, they have closed that loophole: the next version of SQL Server will be licensed on either a per-core or a per-user basis. For those of us who run physical machines, this means that a minimum of 4 cores will need to be licensed per box.
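As a rough sizing aid before the licence conversations start, you can check what the instance itself sees via the sys.dm_os_sys_info DMV – a minimal sketch, assuming hyperthread_ratio reflects logical CPUs per socket, which holds on typical hardware of this era:

    -- How many logical CPUs does this instance see, and across how many sockets?
    SELECT
        cpu_count,                                  -- total logical CPUs visible to SQL Server
        hyperthread_ratio,                          -- logical CPUs per physical socket
        cpu_count / hyperthread_ratio AS sockets    -- rough socket count
    FROM sys.dm_os_sys_info;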

There are also now just three editions, not counting Express – Standard, Business Intelligence, and Enterprise. Core licensing is not available on the BI edition, and the Datacenter Edition is now gone.

So what do we get in each edition? For those who want to exploit some of the new BI tools, such as Power View, or the already existing PowerPivot for Excel, the BI or Enterprise Edition is the way to go. As with 2008, High Availability (clustering) is an Enterprise Edition scenario.

The last item of note is the addition of a “Cloud on your Terms” feature: essentially a cloud-bursting technology that allows the use of extensible data storage on the Azure platform. This might put paid to the idea that transactional databases are not easily migrated into the Cloud.

I’m going to be interested to see what my LAR comes back with as regards comparative costs for core vs processor licensing, and what this potentially means for the need to consolidate our estate.

My Cloud is Better than Your Cloud

To paraphrase Tim Booth: “My Cloud is better than your Cloud, I said my Cloud is better than your Cloud, my Cloud is the ONLY WAY”.

“Beware of the false Cloud!” said Larry Ellison, CEO of Oracle, last month, when he finally announced that Oracle would be developing a Cloud Computing service to encompass many of the corporation’s offerings. Ellison, of course, has had some strident opinions about the Cloud Computing juggernaut over the last few years, declaring in 2008 that the industry had an “Emperor’s New Clothes” complex when it came to the cloud:

“The interesting thing about cloud computing is that we’ve redefined cloud computing to include everything that we already do. I can’t think of anything that isn’t cloud computing with all of these announcements. The computer industry is the only industry that is more fashion-driven than women’s fashion. Maybe I’m an idiot, but I have no idea what anyone is talking about. What is it? It’s complete gibberish. It’s insane. When is this idiocy going to stop?”

He also predicted at the time:

“We’ll make cloud computing announcements. I’m not going to fight this thing. But I don’t understand what we would do differently in the light of cloud.”

Sure enough, last month he made good on that prediction, announcing an umbrella cloud service for Oracle’s most popular offerings while warning of “the false cloud” – a reference to the interoperability and vendor lock-in problems frequently cited as major deterrents by adopting organisations.

Salesforce.com comes in for particular criticism from Ellison as what he describes as “the ultimate vendor lock-in”, since applications developed on the company’s Force.com PaaS offering are not portable between Salesforce and the offerings of other providers. This lack of “standards” and interoperability supposedly disqualifies Salesforce’s offerings from being a “true cloud”. In fact, it is quite possible to export information from Salesforce; what you cannot do is lift entire applications and deposit them on another cloud service. Does that make it not a “true cloud”? I could develop a Java application and deploy it in the Oracle Cloud, but it will not port to another cloud service unless that service is also running Java.

This is interesting to me from a definition standpoint. Just exactly what IS cloud? Is it public? Is it private? Is it that fuzzy area in between? For our organisation it will certainly be that fuzzy area – we have assets that we will not put in the cloud for the foreseeable future; they are too mission-critical, too inimitable, too asset-specific to the organisation. We have other elements of our infrastructure that are tailor-made for cloud deployment, or where the cloud can provide us with ready-made SaaS solutions we can extend more easily than building from scratch.

I tend to think of cloud as having the following essential characteristics:

  • Low commitment levels, typically no more than one-month contracts between provider and user.
  • Payment for only as much of the service as the user consumes (often referred to as a Pay-as-you-Go, or PAYG model).
  • Rapidly provisioned services, often by the users themselves, with minimal intervention from the provider, on an automated basis.
  • The illusion of virtually infinite scalability through server virtualisation.
  • Accessible from virtually any Internet-enabled device, from any location.
  • Dynamically and uniformly updated, upgraded, hardened and patched, without intervention by the user, and with minimal, automated intervention by the provider.

This is a definition that isn’t dependent on any one service model or delivery model. We can talk about Business Process as a Service, Domain Specific Clouds, Purpose-Specific Clouds and a thousand other variations, but people want a definition that works.

For me, the closest is still the NIST definition, which is clearly understandable to anyone with a basic business knowledge of computing. Any time anyone asks me “what is the cloud?” I am tempted to answer with an Ellison-esque “it’s whatever you want it to be”, but more sensibly the answer lies, for the moment, in a definition that fits the essential characteristics defined above.

Windows 8 Task Manager

OK, I have had the collywobbles, to say the least, about Windows 8 and its radical new interface since I saw the previews coming out of the BUILD conference. The Windows Phone-esque design really split opinion in our department: the gadget-heads and the gamers loved it; the developers and code-junkies, less so.

For the code-junkies, and those who really want fine-grained insight into what is going on in their systems, the improvements to the Task Manager in Windows 8 are perhaps an example of where we should have a little more confidence in our friends from Redmond.

There is a great blog post about the Task Manager improvements here, along with a little preamble that quite nicely summarises the various incarnations of this more-or-less ever-present tool, from Windows 3.1 to the present day (I must admit to feeling a little twinge of nostalgia at seeing the NT dialog again – my first IT job was managing an NT 4.0 network).

The highlights for me are the Service Host details – having an insight into that mysterious entity is a really great improvement – and the colour-grading of apps according to how much CPU they are eating, a nice touch that makes the problem on a machine stand out like a sore thumb.

Nice work. Maybe I will like Windows 8 after all!

I will make 800 feet

This weekend has all been about injecting some confidence into my dissertation writing process: getting stuff written no matter how hard each and every word is, and getting that damn survey out via every channel I can think of.

Feeling much better about the whole process than I did at the start of the weekend, that’s for sure. I’ve been writing exams, essays and tests for 30 years, and in the immortal words of Harry Stamper…

I WILL MAKE 800 FEET

*disclaimer: I know it’s cheese, but sometimes you need a pizzaful.

Social Enterprises

I have spent (most of) today at the Cloudforce 2011 event at the Royal Festival Hall in London, looking at Salesforce’s Sales Cloud and Service Cloud solutions, but the thing that has struck me most is the whole concept of the Social Enterprise that the company is embracing.

One statement I found particularly pertinent was: “Your customers have embraced the social network, so have your employees, so why hasn’t your business?” We know more about some relative strangers on Facebook and Twitter than we do about some of our employees. The guy on your picking line is on his phone, or on his PC in the evenings – he knows a lot about the quality of what you are producing – so why aren’t you harnessing that knowledge through the power of social media?

The keynote was inspiring, and merely scratched the surface of this whole “Corporate Revolution” that is about to hit enterprises – a revolution so powerful it makes the front cover of this month’s Forbes magazine. Customers have a powerful tool to tell people exactly what they think of your products and services. And do you know what? Other customers pay attention to what they are saying via these channels. If a prospective customer wants to find out about your company today, do you think they are going to speak to you first, or are they going to ask someone already using you, “What do you think of Company X? What’s your relationship like with them?” In this age of viral Internet messages, do you really want to piss off someone with this kind of power at their fingertips?

Powerful stuff. If you want to tell me how you are using the cloud to make your small or medium enterprise a social enterprise, how about filling in the cloud survey?

Please Help with my MSc Research (and get a free copy of the results!)

As is detailed on my Research page, I am currently writing an MSc dissertation, the focus of which is to investigate motivations and barriers to the use of cloud computing solutions in Small and Medium Enterprises (SMEs).

The survey is located at http://app.fluidsurveys.com/surveys/cloudresearch/sme-adoption/

If you have responsibility for Information Technology within an SME (or advise one on technology matters), I would be very grateful if you could spend just a little time completing the survey.

If you enter your email address at the end of the process, I will send you a copy of the research, once it is published in March 2012.

Many thanks!

 

To EOM or not to EOM

A relatively new colleague emailed me today (nicely, to thank me for resolving an issue), and the compliment came entirely in the subject line, with no body, ending with the acronym “EOM”.

What does EOM mean? I wondered, and, not wishing to demonstrate my ignorance to the person in question, I turned to the wisdom of others. It turns out EOM means “End of Message”, and this Lifehacker entry outlines some very good reasons why you might want to use it.

I am very guilty of writing long and rambling email messages that no one is ever going to read to the end of, which really lowers the effectiveness of whatever message I am trying to send. Definitely going to give this a go, and try to stick by the principle that if you cannot say it in less than two paragraphs, then email is probably not the most effective communication mechanism.

MSDTC and Linked Server Transactions

My transport application gathers information from 5 different OLTP databases to provide a unified view of what is going on logistically throughout our business.

As we roll this out to various new depots, I have encountered a few issues with Distributed Transactions which I thought I would share here. The data load into the central database occurs through SSIS packages running against each local database. The packages on the site and server where the transport database runs executed fine; however, I was getting a few errors from what are essentially the same packages on the other sites, namely:

“Login failed for user ‘NT AUTHORITY\ANONYMOUS LOGON’”

This boils down to the Linked Server security being set incorrectly – make sure the linked server is mapped to an actual login SQL Server can use, rather than just being made using the “current security context”.
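For reference, the same mapping can be scripted with sp_addlinkedsrvlogin rather than clicked through the Linked Server properties dialog – a sketch, where REMOTESRV and ssis_loader are hypothetical placeholders for your linked server name and the remote SQL login:

    -- Map all local logins to a specific SQL login on the linked server,
    -- instead of relying on the current security context.
    -- REMOTESRV and ssis_loader are hypothetical placeholders.
    EXEC sp_addlinkedsrvlogin
        @rmtsrvname  = N'REMOTESRV',
        @useself     = N'FALSE',        -- do not impersonate the calling login
        @locallogin  = NULL,            -- apply to all local logins
        @rmtuser     = N'ssis_loader',
        @rmtpassword = N'<password>';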

Configuring the linked server properties allowed the SSIS packages to run. However, certain actions in the interface that rely on transactions to exchange data between the OLTP database(s) and the central database would fail with this error:

The transaction manager has disabled its support for remote/network transactions. (Exception from HRESULT: 0x8004D024)

If you want to ship data via transactions between SQL Servers in this manner, you must configure the Microsoft Distributed Transaction Coordinator (MSDTC) for remote connections. This is done slightly differently depending on whether you are running clustered SQL Server or not.

For Non-Clustered installs:

  1. Go to “Administrative Tools > Component Services”.
  2. Navigate to “Component Services > Computers > My Computer > Distributed Transaction Coordinator” (you may need to double-click and wait, as some nodes take time to expand).
  3. Right-click on the Local DTC entry and go to Properties.
  4. Click the “Security” tab.
  5. Make sure you check “Network DTC Access”, “Allow Remote Clients” and “Allow Inbound/Outbound”. Check “Enable TIP” too (not required if the client machine and remote machines are in the same domain).
  6. The MSDTC service will restart.

For Clustered Installs:

  1. Go into Failover Cluster Manager.
  2. Expand the cluster name.
  3. Go to Services and Applications.
  4. Right-click on the clustered MSDTC service and click “Manage MSDTC”.
  5. Navigate to “Component Services > Computers > My Computer > Distributed Transaction Coordinator” (again, some nodes take time to expand).
  6. Open the clustered entries, right-click on the DTC entry that corresponds to the clustered service name, and go to Properties.
  7. Click the “Security” tab.
  8. Make sure you check “Network DTC Access”, “Allow Remote Clients” and “Allow Inbound/Outbound”. Check “Enable TIP” too (not required if the client machine and remote machines are in the same domain).
  9. The MSDTC service will restart.

After that, the transactions ran successfully between the different SQL installations on the network.
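If you want to sanity-check the MSDTC configuration before retrying the application, you can force a distributed transaction over the linked server by hand – a minimal sketch, with REMOTESRV, CentralDB and dbo.LoadAudit as hypothetical placeholders for your own linked server, database and table:

    -- Force a distributed transaction over the linked server.
    -- If MSDTC is still misconfigured on either side, this fails with
    -- the 0x8004D024 "remote/network transactions" error shown above.
    BEGIN DISTRIBUTED TRANSACTION;

    INSERT INTO [REMOTESRV].CentralDB.dbo.LoadAudit (SiteName, LoadedAt)
    VALUES (N'Depot01', GETDATE());

    COMMIT TRANSACTION;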

 

Exchange 2010 Disk Full Disaster Recovery

Oh dear… the Exchange Server’s disk is full and yes, you guessed it, by the time we realised what was happening, it was already too late. Frantic attempts to delete content came to nought (and wouldn’t have made any difference anyway, because the Mailbox Agent wouldn’t release the space until the Maintenance Window).

So, the drive ran out of space and the mailbox database dismounted itself. A common reason for this is that the log files are not being flushed after a backup. In our case the flushing was taking place; the .edb file had simply grown far too big for the drive.

So, what the hell do we do now? There is space on the system disk, but it is on a different RAID array, so shrinking that volume to extend the other one is a no-go. For the record, here is what we did:

  1. Took a file system backup of the database in its dismounted state. This will not clear the logs, but it ensures we have something to restore if the next bit goes wrong.
  2. Moved the database onto a removable storage device of decent quality (in our case a Seagate 500GB desktop drive), which should be at least USB 2.0. This is done through the Exchange Management Console by right-clicking on the database in trouble under Organization Configuration > Mailbox and selecting “Move Database Path”.

Note the original values in the Move Database Path dialog so you can put them back later.

Needless to say, this is going to take a while. An hour and a half later, we were able to re-mount the database whilst we went and procured more drives.

Exchange will complain terribly about performance whilst the database is hosted on this drive, so we needed to get it off there as soon as practicable. In this instance we purchased 4 x 435GB Seagate Cheetah 15K.7 SAS drives. To swap out the old array, we did the following:

  1. Dismounted the database again, along with the Public Folder store.
  2. Moved the Public Folder store and the log files onto the system disk, which still had 40GB of space remaining.
  3. Powered off the server.
  4. Replaced the drives.
  5. In the PERC controller, initialized the drives and configured a RAID 10 array and a virtual disk of 850GB.
  6. Brought the server up.
  7. Went into Administrative Tools > Computer Management > Storage Management, initialized the drives with an MBR when asked to do so, and created the new volume, formatted with NTFS, preferably on the same drive letter as before.
  8. Back in Exchange, reversed the earlier moves, restoring the original paths from our notes.
  9. Mounted the databases. If all goes well, and being on the USB drive did not damage the database, it will mount; if not, seek advice on the eseutil command to repair the database file.
  10. The Exchange Server should note the absence of its full-text catalog and recreate one for itself. Note that at this point it will have to trawl the entire store, indexing the content as it goes. Since the RAID array is still initializing its mirror, do not expect lightning-fast performance until these two processes have finished (this could take a considerable number of hours).

Monitor the state of the server. Pour coffee.