Category Archives: Development

SharePoint Saturday Chicago #SPSChicago

Today I presented a 101 session on getting started with the new SharePoint App Model. It’s a lot to get through in 75 minutes, but hopefully it gave people enough to get started and try building an app for Office 365 or SharePoint on-premises.

Thanks to everyone who came along!  #SPSChicago is a great free event with a fantastic lineup of speakers.

Here are the slides:

Subscriptions for Office and SharePoint apps arrive

One of the biggest issues developers face today across the various app ecosystems is how to build an ongoing revenue stream.  This is the problem mobile developers face, and it’s why many mobile marketplaces have introduced in-app purchases.

In the business world this is even more of a problem. Business apps are typically a lot more complex and require a big investment to build, maintain and improve over time. It is simply not possible to build a business model on a single one-time payment when someone buys an app.

What app builders need is an ongoing revenue stream via a subscription model.  It has been far and away the most requested revenue feature in the Office/SharePoint store.

Starting today that is possible. Here is the announcement.

The long and short of it is that app developers can now submit or update their apps to use the subscription model.  As of writing, the store doesn’t seem to be offering subscriptions yet, but I suspect it’s going live shortly.

Update:  It looks like the only subscription option will be per user per month. No yearly option, and nothing like per tenant per month/year etc…

The implications of this are HUGE.  I have heard from many people who are just not interested in building apps unless they have subscriptions. To date, only ISVs with decent size and resources have been able to work around this store limitation by building their own commerce model and handling their commerce and licensing outside of the store (which is allowed).  Nintex recently launched Nintex Workflow for Office 365, which does this: licensing is managed outside of the store and they manage the subscriptions themselves. This is also handy because they can deal with much more complex subscription types, such as pricing tiers.  Managing your own subscriptions is nice because you have full control, but with it comes the complexity of doing all the commerce yourself. On the other hand, by doing the commerce outside of the store you are not giving Microsoft a 30% cut of your app sales.

I am really interested to see which other ISVs now pull the trigger and start offering apps in the store now that we have this capability.  I suspect the big ISVs won’t be using the store commerce model anyway … but I would expect to see some smaller ISVs coming to the party soon.


Super easy source control – Using Git and Dropbox together

I have been a big fan of Git source control for a while now.  It’s got great momentum, loads of support on various platforms and plenty of tools to help with using it.

For those not aware of Git, I suggest reading the Wikipedia article, as I am not going to cover the basics of what a source control system is, or in Git’s case, a distributed version control system.

I like Git … it is super simple. There is no server and no SQL database; it’s all just on the file system.
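You can see this for yourself from the command line. Here is a quick sketch (the directory name is just an example):

```shell
# Create an empty Git repository and look at what Git actually stores:
# everything lives in plain files and folders under .git; no server needed.
mkdir demo
cd demo
git init
ls .git          # HEAD, config, objects/, refs/, ...
cat .git/HEAD    # shows which ref the current branch points at
```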

As a Windows user I recommend the Git Extensions tools. It gives you a GUI tool that lets you do almost everything with your Git repository.

If you are working with someone else on some source, there are a couple of strategies for using Git, as I understand things (please let me know if there are better options):

  1. Create Patches and send those to others (say in an email)
  2. Create a central repository that everyone has access to, and Push/Pull changes to that repo

GitHub is an example of option 2 that a lot of people will recognize. Another, less well known option is Microsoft’s Team Foundation Service (TFS) Online.  Both allow you to create and interact with repositories remotely in a reasonably simple manner.  You have to pay for private repos on GitHub, and TFS is currently free for up to 5 users, with more pricing options coming soon.

Option 1 seems hard to me. Constantly managing patches etc… eeek!
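For the curious, option 1 looks roughly like this on the command line (the patch filename shown is just what Git generates from the commit message):

```shell
# On your machine: turn your latest commit into a patch file.
git format-patch -1 HEAD
# This writes something like 0001-my-change.patch, which you then
# email (or otherwise send) to your collaborator.

# On their machine: apply the patch as a proper commit.
git am 0001-my-change.patch
```

Doable, but you can see why hand-shuffling patch files gets old fast.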

However, for two or three person projects I wanted an even simpler option, so I started looking at using Dropbox to host the central repo. The idea is that team members push/pull from a repo that lives in a Dropbox folder, which is replicated between machines. This means no additional authentication setup with those services, which I think makes life easier.

There is a drawback that you need to be aware of!  You really want to avoid people pushing changes to the central repo at the same time, as this can lead to Dropbox getting confused and corrupting the repo. However, if you IM people and make sure it’s all clear, then you should be good to go.  This is really only suitable for a two or three person team. Get a GitHub account if you need something more robust 🙂

Here is how it looks:


The central repo doesn’t have a “working directory” (a copy of the source that you can work on); it’s just the Git repo, so you can’t edit files in this location.  The central repo is synced between, in this case, two people’s machines (there is also a copy in the Dropbox cloud).

Each person “clones” the central repo to a local working location on their machine. They do this as a personal repo so that they get a working directory, which allows them to edit and modify files. They can also “commit” (a.k.a. check in) changes to their personal repo.

When they are ready they Push commits from their personal repo to the central repo.  This then replicates the changes to others.

This might sound complex, but if you are a Git user and familiar with the tools, it’s pretty easy to set up.

Using Git Extensions you do this as follows:

First create a new Dropbox folder:


Open Git Extensions and create a new central repo like this in your new folder in Dropbox:


Now go and create your personal local repo by cloning the central repo … but do it to a local (non-Dropbox) directory.


Then you can open your local repo and work your coding monkey magic in there. Drop in a file, for example (note the irony of the filename I am using here and the file extension … I don’t like JS).


Commit that sucker and you have made your first legit change to the code.

At this point your buddy can’t see your stuff; you have been working locally.  So now you need to Push your commit to the central repo.


Notice that your central repo is already set up as the “origin” remote location.  Push to that and you will notice some files start syncing in Dropbox.


Now that your central repo is all syncing and dancing, your counterpart can join the Dropbox folder and do the same “clone” process you did to create a local repo they can work in.

I got the idea for this method a while back from a Stack Overflow thread, which shows how to do this same process via the Git command line.
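In case it’s handy, here is roughly what that command-line version looks like; the Dropbox and source paths here are just examples:

```shell
# 1. Create the central repo as a *bare* repo (no working directory)
#    inside your Dropbox folder:
git init --bare ~/Dropbox/project.git

# 2. Each person clones it to a local, non-Dropbox working directory:
git clone ~/Dropbox/project.git ~/src/project
cd ~/src/project

# 3. Work away, commit locally, then push; Dropbox syncs the central
#    repo to everyone else:
echo "hello" > readme.txt
git add readme.txt
git commit -m "First commit"
git push origin master    # "origin" was set up automatically by the clone

# 4. Collaborators pull your changes into their own clones:
git pull origin master
```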


SharePoint Saturday Redmond – Building solutions with the future in mind

I had a great time presenting at the SharePoint Saturday Redmond event today at the Microsoft Conference Center in Redmond.

My session was aimed at asking people to think about building and architecting solutions today while keeping the future in mind … even if you don’t have any plans to move to the cloud in the near future.

Even if you are not building cloud apps or solutions today this is still relevant. Get code out of your SharePoint processes and into a loosely coupled architecture. Your servers will be safer, faster and will remain up more often.

Here are the slides.

Erica Toelle was kind enough to take some notes during the session:

Launching SharePoint Apps, launch sequence, debugging and AppRedirect.aspx

Writing SharePoint Apps can be tricky at times, especially until you learn where to look when things are not working. Usually the first issue people hit is launching their app and having it fail.

One of the first errors people hit is a common one that says “The parameter ‘token’ cannot be null or empty string”.


This is a signal that something went wrong launching the application: some of the information that SharePoint passes to the app during launch wasn’t passed correctly, usually due to an app setup issue.

The particular error above could be caused by a variety of issues, however the point of this post is to show you the first place I look when trying to debug it.  Hopefully this will help others find and fix their problem too.

First, to understand where to look when debugging, it’s important to understand what happens when you launch a SharePoint App. The basic steps of the launch process are:

  1. The user clicks the app to launch it
  2. SharePoint sends the user to AppRedirect.aspx, telling it which app to launch
  3. AppRedirect.aspx builds a set of launch parameters, including a context token and other parameters about the SharePoint site launching the app, e.g. SPSiteUrl
  4. AppRedirect.aspx renders a page with these parameters in a hidden HTML form
  5. JavaScript runs and submits the form to the application’s launch page, POSTing these form parameters along with some query string parameters
  6. The app page is called and can read these parameters from the Request and use them as needed, e.g. to call back into SharePoint with the CSOM

The error shown above comes from a snippet of boilerplate code in the provider hosted app template in Visual Studio.  In order to construct the CSOM context, the classes in TokenHelper.cs need some of the information passed in the POST parameters from AppRedirect.aspx.  These values are not on the query string, as they are POST parameters, so this is usually where people get stuck debugging.  You need to use a tool like Fiddler to see the POST being made and take a look at it.

Below is what the launch sequence looks like in Fiddler (click on the image to get a bigger version):


  1. AppRedirect.aspx is called, the launch parameters are constructed, and the form is submitted by the browser.
  2. The app page is called, in this case Default.aspx.
  3. My app wants to make a call to SharePoint using the CSOM, and as part of that needs an OAuth access token, so it requests one from Azure Access Control Service (ACS) (a topic for another post).
  4. Finally the CSOM query is processed and the call to ProcessQuery is made.

In the browser you will recognize the AppRedirect.aspx page by the infamous “Working on it…” text 🙂

If you open #1 above in Fiddler and look at the raw response to the browser, you will see the following:


This is the HTML Form with the launch parameters included. 

If the SPAppToken is empty … you have a launch problem 🙂

If your form is missing the SPAppToken then you will get the error message at the beginning of this post. The SPAppToken holds the context token that you need in order to use the CSOM.

If it’s empty in the form, you likely have an error message in SPErrorInfo giving you some more information about why it is null/empty.

SPErrorInfo is your first place to start looking for why things are not working.

This should provide some insight and more pointers on where to look next.

For example: “The Azure Access Control service is unavailable”.   This message says that SharePoint can’t construct the SPAppToken because it can’t talk to ACS, which could be for a variety of reasons, such as network connectivity.

This post wasn’t written to solve every problem you might have launching an app; it was written to give those debugging their apps some tips on where to look for information when launch isn’t working.

I hope it helps!


SharePoint Provider Hosted Apps in Azure monitoring tip

One of the tips I gave during my session at TechEd North America this year was about using SignalR in your SharePoint provider hosted applications in Azure.

One of the pain points for developers and people creating provider hosted apps is monitoring them when they are running in the cloud. This might be just to see what is happening in them, or it might be to assist with debugging an issue or bug.

SignalR has helped me A LOT with this.  It’s a super simple to use, real-time messaging framework. In a nutshell, it’s a set of libraries that let you send and receive messages in code, be that JavaScript or .NET code.

So how do I use it in SharePoint provider hosted apps in Azure to help me monitor and debug?

A SharePoint Provider Hosted App is essentially a web site that provides the parts of your app that surface in SharePoint through App Parts, App Pages etc… It’s a set of pages that can have code behind them, as in any regular site.  It’s THAT code that I typically want to monitor while it’s running in Azure (or anywhere, for that matter).

So how does this work with SignalR?

SignalR has the concept of Hubs that clients “subscribe” to and producers of messages “publish” to.  In the diagram below, the App Pages’ code publishes messages (such as “there was a problem doing XYZ”), and consumers listen to a Hub and receive messages when they are published.


In the example I gave at TechEd I showed a SharePoint Provider Hosted App deployed in Azure that Published messages whenever anyone hit a page in my app.  I also created a “Monitor.aspx” page that listened to the Hub for those messages from JavaScript and simply wrote them to the page in real-time.

How do you get this working? It’s pretty easy.

Part 1: Setting up a Hub and publishing messages

First, add SignalR to your SharePoint Provider Hosted app project from NuGet. Click the image below for a bigger version showing the libraries to add.


Then in your Global.asax.cs you need to add an Application_Start like this.  It registers SignalR and maps the hub URLs correctly.

protected void Application_Start(object sender, EventArgs e)
{
    // Register the default hubs route: ~/signalr
    RouteTable.Routes.MapHubs();
}

Note: you might not have a Global.asax file, in which case you will need to add one to your project.

Then you need to create a Hub to publish messages to and receive them from.  You do this with a new class that inherits from Hub, like this:

public class DebugMonitor : Hub
{
    public void Send(string message)
    {
        // Broadcast the message to every connected client
        Clients.All.addMessage(message);
    }
}

This provides a single method called Send that any code in your SharePoint Provider Hosted app can call when it wants to send a message. I wrapped this code up in a short helper class called TraceCaster like this:

public class TraceCaster
{
    private static IHubContext context =
        GlobalHost.ConnectionManager.GetHubContext<DebugMonitor>();

    public static void Cast(string message)
    {
        // Publish the message to everyone listening on the hub
        context.Clients.All.addMessage(message);
    }
}

This gets a reference to the Hub, called “context”, and then uses it in the Cast method to publish the message.  In code I can then send a message by calling:

TraceCaster.Cast("Hello World!");

That is all there is to publishing/sending a simple message to your Hub. 

Now the fun part … receiving them 🙂

Part 2: Listening for messages

In my app I created a new page called Monitor.aspx. It has no code behind, just client side JavaScript. The code first references some JS files: jQuery, SignalR, and then the generic hubs endpoint that SignalR listens on.

<script src="/Scripts/jquery-1.7.1.min.js" type="text/javascript"></script>
<script src="/Scripts/jquery.signalR-1.1.1.min.js" type="text/javascript"></script>
<script src="/signalr/hubs" type="text/javascript"></script>

When the page loads, you want some JavaScript that starts listening to the Hub and registers a function, “addMessage”, to be called when a message is sent from the server.

$(function () {
    // Proxy created on the fly
    var chat = $.connection.debugMonitor;

    // Declare a function on the hub so the server can invoke it
    chat.client.addMessage = function (message) {
        var now = new Date();
        var dtstr = now.format("isoDateTime"); // uses a date-format plugin

        $('#messages').append('[' + dtstr + '] - ' + message + '<br/>');
    };

    // Start the connection
    $.connection.hub.start().done(function () {
        $("#broadcast").click(function () {
            // Call the Send method on the hub
            chat.server.send($('#message').val());
        });
    });
});
This code uses the connection.hub.start() function to start listening to messages from the Hub.  When a message is sent, the addMessage function fires and we can do whatever we like with it. In this case it simply appends it to an element on the page.

All going well when you are running your app you will be able to open up Monitor.aspx and see messages like this flowing:


If you don’t see messages flowing, you probably have a SignalR setup problem.  The most common issue I found when setting this up was the client not being able to correctly reference the SignalR JS or the Hub on the server.  Use the developer tools in IE or Chrome (or Fiddler) to check that the calls being made to the server are working correctly (see below for what working should look like):


If you are sitting there thinking “What if I am not listening for messages? What happens to them?”, I hear you!  Unless someone is listening for the messages, they simply go away; they are not stored. This is a real-time monitoring solution. Think of it as a window for listening to what’s going on in your SharePoint Provider Hosted app.

There are client libraries for .NET, JS, iOS and Android too, so you can publish and listen for messages on all sorts of platforms.  Another application I have used this for is simple real-time communication between Web Roles and Web Sites in Azure.  SignalR can use the Azure Service Bus to assist with this, and it’s pretty simple to set up.


I’m a developer from way back, when debugging meant printf.  Call me ancient, but I like being able to see what is going on in my code in real time. It just gives me a level of confidence that things are working the way they should.

SignalR coupled with SharePoint Provider Hosted Apps in Azure is a great combination.  It doesn’t provide a solution for long term application logging, but it does provide a great little real-time window into your app that I personally love.

If you want to learn more about SignalR, I suggest taking a look at the official site, where you will find documentation and videos on other uses for SignalR.  It’s very cool.

Do I use it in production? You bet!  I use it in the backend of my Windows Phone and Windows 8 application called My Trips, as well as in SharePoint Provider Hosted Apps in Azure. Here is a screenshot from the My Trips monitoring page; I can watch activity as various devices register with my service.


Happy Apping…


SPChat transcript: App Model with Chris Johnson

On Wednesday last week I had the pleasure of participating in my first SPChat over on the site.  These are online Q&A chats where an “expert” is invited and questions are fielded in real time from the live audience.  Each chat has a topic, but anything goes within that topic.

My topic was the new SharePoint application model for developers in SharePoint 2013 and Office 365. It was really fun fielding questions, but I also felt like my typing was slow, and it was hard to temper my answers knowing that I wanted to get as many questions answered as possible.

Questions ranged from the app store submission process to app domain isolation, to the tools I use in app development and the financial side of apps.  All interesting topics.

You can read a full transcript online here:

I hope to do another one some day!


The importance of APIs

This week was interesting for owners of Nissan Leaf all-electric vehicles.  Nissan has apps for iPhone, Android and Blackberry that let owners see the battery level of their car, turn charging on and off, and set the climate control inside their car.  For Windows Phone users (and any other platform, like Windows 8, or any non-officially-supported platform), 3rd party apps like LEAF Commander are important to us.

Then a week or so ago Nissan said the following about the ability to use these apps:

“for the privacy and security of our owners, Nissan will be removing that capability” [op: ability to use 3rd party apps]

This of course went down like a cup of cold puke with owners, especially in the Seattle, Bellevue and Redmond area, where Microsoft is king and Windows Phone reigns supreme.  LEAF Commander is being used by more than 2,000 active Leaf owners worldwide, so the network of Leaf owners erupted, and I am sure the poor people at the Nissan customer support desk got an earful from more than a few disgruntled people.

It must have got their attention as today I got an email back from Nissan saying:

Based on initial customer feedback, we understand how important this connectivity is to LEAF drivers, and we will delay taking this action while we further study other potential solutions and explore ways to keep customer data secure.

It’s a shame it’s just a delay … but hopefully this results in a better thought out decision.

So what does this have to do with APIs? 

As I understand it, there isn’t a well documented and thought out API for talking to the Nissan system.  These apps have users type their usernames and passwords into the apps themselves, and I am guessing that is what Nissan was unhappy about.  If you give your username and password to an app, it could send or save them and use them for whatever it liked, and there would be no way for Nissan to tell whether a real user or an app was calling their service.

This is the reason why many services on the internet, Twitter, Facebook, Flickr, TripIt and Office 365 included, use OAuth to broker allowing/trusting an app to make calls to a service on a user’s behalf.  The user never gives the app their username and password; they just authorize the app and tell the service that it’s OK for that app to do certain things on their behalf.  When they are no longer OK with that, they simply revoke that app’s access. In Facebook that screen looks like this:


I would dearly love to see Nissan open up an API that any app developer could use and that was authorized with something like OAuth.  This would let other developers build on their APIs, offer their customers great app experiences and, most importantly, build customer satisfaction.  I use a lot of 3rd party apps for services like Twitter because I think they are better and offer me more than the official ones.  The same goes for the Nissan apps (CarWings is the app’s name if you want to check it out).  They are OK … but not GREAT! I am sure other developers could do better and add other things alongside the basic stuff that Nissan has no interest in.

Take my experience with TripIt, for example.  I make an app called My Trips for Windows Phone and Windows 8 that gives people a better TripIt experience than the official app on Windows Phone, and an alternative to using the browser on Windows 8 (offline support etc.).  TripIt has embraced others building apps on their service. There are loads of apps out there that integrate with TripIt, sync trips back and forth, integrate between systems etc… It’s a pretty robust ecosystem, and they get a lot of credit for allowing it!

In summary, I think this short story helps illustrate the importance of having APIs. One company went about poorly supporting their customers (via a no-API stance) while others are embracing APIs and thriving (TripIt, Office 365 etc…). I would dearly like to see Nissan learn from this and offer a secure and robust API that we can build great experiences on.  Hopefully they won’t just delay the decision; hopefully they will build out a real API and a method of accessing it!

In this world of connected devices and services I can only imagine this becoming more important, and I expect companies that get it right will thrive while those that don’t will fail.

OK, OK, I secretly want to be able to control my car from my watch … come on, Microsoft, sell a Surface watch, and Nissan, ship a decent API!


SharePoint app tools in Visual Studio 2013 preview, the new SharePointContext helper!

Today at the Build conference the new preview of Visual Studio 2013 was released.  Along with it came some nice advances in the tools for building SharePoint and Office applications.

You can read the full post about the tools here.

The most interesting of these is the new out of the box project template option for creating an ASP.NET MVC based application.  The project creation wizard offers it for provider hosted and autohosted apps.


When you create a totally out of the box app using the wizard and picking this setting, you get a very familiar looking app if you have ever created an ASP.NET MVC app before.


Now, being able to create these via the wizard is pretty handy, but what is even cooler is a particular improvement in the additional helper classes you get in the new template.

Namely the SharePointContext class.

This new class, added in the new out of the box template, helps you manage your SharePoint context across page requests.


When SharePoint “starts” an app, i.e. when a user launches an app, SharePoint packs up some information about that user and passes it along to the app as a POST parameter. This is called the ContextToken, and it contains the OAuth tokens/information that you need in order to make calls back to SharePoint. The trick is that SharePoint passes it to your app only when the app launches; after that, it is up to you to do something like cache it so that on subsequent page requests your app still has that context and can reuse it.  The basic auth token in it is good for 12 hours, and it also contains a refresh token that can be used to get new auth tokens for up to 6 months.

The first question people who are new to SharePoint app development ask me is “Why does my app break on the second page request?” … it is usually because they don’t realize it’s totally up to them to cache these very important tokens.

The new MVC template in the VS 2013 preview includes the SharePointContext class that helps with this whole process & gives you an out of the box experience that doesn’t totally suck.

In a new project you will see a line of code that looks like this:

var spContext = SharePointContextProvider.Current.GetSharePointContext(HttpContext);

This creates a new context object and initializes it using all that good information passed as POST parameters, plus some query string parameters too.

Then you can use that context to get Client Side Object Model (CSOM) instances really easily like this:

using (var clientContext = spContext.CreateUserClientContextForSPHost())
{
    if (clientContext != null)
    {
        spUser = clientContext.Web.CurrentUser;

        clientContext.Load(spUser, user => user.Title);
        clientContext.ExecuteQuery();

        ViewBag.Message = spUser.Title;
    }
}

Now, this is not all that different from what you used to do on the first request to your SharePoint app: TokenHelper helped you create a CSOM instance from the request details. However, SharePointContext goes further.

When you make the GetSharePointContext(HttpContext) call, the class checks ASP.NET Session state for an existing context. If it doesn’t find one, it creates a new one based on the information passed, and then stashes it in Session state for subsequent requests.

Then on subsequent requests, when SharePoint hasn’t passed any new tokens, GetSharePointContext(HttpContext) returns the context from Session state. It will also deal with tokens expiring, when and if they do.

(Update 6/28 – added this paragraph)
Additionally, the provided [SharePointContextFilter] attribute on controller actions ensures a context is passed from SharePoint; if not, it will redirect the user back to SharePoint to obtain one.  Why is this important?  Well, if someone bookmarks your app, then when they use that bookmark the context won’t be passed, and in that case you need to bounce them via SharePoint to go and get one on the first request.  The [SharePointContextFilter] automates that for you (this is only available in the MVC project, however). Very handy indeed not having to wire up this flow yourself!

You don’t need to worry about writing any of this yourself, however, as it is all done for you in the helper class.  Go take a look at the GetSharePointContext(HttpContextBase httpContext) method if you are interested in seeing how it works, and more importantly, run your app and step through the code so you can see how it runs.

Once you have a SharePointContext object you can take a look in it and find the CacheKey (used for uniquely identifying this in the cache), ContextToken (for making calls back to SP), RefreshToken (for getting more access tokens when they expire) and other properties it stashes for you like the site language, SP product version etc…


The way it is structured is also very friendly to replacement if you want to roll a different caching technique.  You could replace the existing implementation or create your own SharePointContextProvider class that manages the caching.

The library comes with implementations for both on-prem and cloud scenarios:

  • SharePointAcsContextProvider – Office 365
  • SharePointHighTrustContextProvider – On-Prem apps using the high trust S2S auth model


This is a simple but very timely addition to the out of the box templates in VS!

Just a few weeks ago at TechEd North America I did a tips and tricks session for app developers, and Demo #2 showed a simpler version of essentially the same thing.  The main difference in the helper class I showed in that demo was that it works for ASP.NET Forms apps as well as MVC (Update 6/27: the newly shipped helper does support ASP.NET Forms based apps too) … however, it doesn’t deal with the high-trust S2S scenario for on-prem only apps.

Shout out to @chakkaradeep for the great work on the VS SharePoint tools (a topic near and dear) & for taking the time to watch my TechEd session and let me know they were releasing this helper today!

Update 6/27/2013: Mike Morton did a great session at Build yesterday that walks through a whole lot of this. Watch it here:

To try this out for yourself you will need to get the VS 2013 preview bits:

You will also need a SharePoint site to try it out in and I recommend signing up for a trial Office 365 site here: