
Universal Analytics

The syntax has changed for the new version of Google Analytics (Universal Analytics).

This post shows some examples.
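For example, the old asynchronous ga.js calls versus their Universal Analytics (analytics.js) equivalents. This is a sketch: 'UA-XXXXX-Y' is a placeholder property ID, and the `ga` stub at the top just records calls so the example runs standalone; on a real page `ga` is defined by the analytics.js bootstrap snippet.

```javascript
// Minimal stub so the example runs outside a browser; on a real page
// this function is provided by the analytics.js bootstrap snippet.
var hits = [];
function ga() { hits.push([].slice.call(arguments)); }

// Old (ga.js, asynchronous queue) syntax:
//   var _gaq = _gaq || [];
//   _gaq.push(['_setAccount', 'UA-XXXXX-Y']);
//   _gaq.push(['_trackPageview']);

// New (Universal Analytics, analytics.js) syntax:
ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');

console.log(hits.length); // 2 calls recorded
```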

Backup SQL DB with PowerShell


Set-ExecutionPolicy RemoteSigned  # only needed if unable to import the SQL module
Import-Module sqlps

cd SQLSERVER:\SQL\COMPUTERNAME
$insts = Get-ChildItem
$insts.Databases | Where-Object { $_.Status -eq "Normal" } | Backup-SqlDatabase

N.B. if the script is run without the status filter, it will fail when it tries to back up any database that is offline.





Google Analytics API

1. Install the Google.Apis.Analytics.v3 client library from NuGet.

2. Log into the Google Developer Console:

Set up a project and create a new "Service account".

Download the key and copy the service account's email address.

3. Log into Google Analytics:

Set up a new user – the email address is the one obtained in step 2 above.

4. Basic code will be: 

string AuthenticationKey = string.Empty;

var serviceAccountEmail = "";

// N.B. the downloaded key also needs to be attached to the initializer
// (e.g. via FromCertificate) for the token request to succeed.
var credential = new ServiceAccountCredential(
    new ServiceAccountCredential.Initializer(serviceAccountEmail)
    {
        Scopes = new[] { AnalyticsService.Scope.AnalyticsReadonly }
    });

if (credential.RequestAccessTokenAsync(CancellationToken.None).Result)
{
    AuthenticationKey = credential.Token.AccessToken;
}

var gas = new AnalyticsService(new BaseClientService.Initializer()
{
    HttpClientInitializer = credential,
    ApplicationName = "random variable Insights",
});

var li = gas.Management.Accounts.List().Execute();

Account ac = li.Items.Where(a => a.Name.Contains("google analytics name")).First();
ManagementResource.WebpropertiesResource.ListRequest WebPropertyListRequest = gas.Management.Webproperties.List(ac.Id);
Webproperties WebPropertyList = WebPropertyListRequest.Execute();

var profId = WebPropertyList.Items.Where(et => et.Name == "google analytics name 2").Select(a => a.DefaultProfileId);

var profStr = string.Format("ga:{0}", profId.First());
var r = gas.Data.Ga.Get(profStr, "2013-01-01", "2014-01-21", "ga:pageviews");

r.Dimensions = "ga:pagePath";
r.MaxResults = 5;

var results = r.Execute();


5. BAM!


6. (May need to run as Network Service.)
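The returned GaData can then be read from its Rows collection – each row is a list of strings, with dimensions first, then metrics. A small sketch continuing from the query above:

```csharp
// Sketch: printing the top pages returned by the query above.
// With ga:pagePath as the dimension and ga:pageviews as the metric,
// row[0] is the page path and row[1] is the pageview count.
foreach (var row in results.Rows)
{
    Console.WriteLine("{0}: {1}", row[0], row[1]);
}
```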

Resources for learning about building websites

A few useful links related to learning to build websites.








Twitter oAuth

As advised on the Twitter API calendar, version 1.0 of the API stopped working on June 11, 2013.

This is a very rough-and-ready Twitter class to get things working again:

using System;
using System.Web;
using System.Configuration;
using System.Net;
using System.IO;
using System.Text;
using Newtonsoft.Json;

public class TwitterHelper
{
    string ConsumerKey;
    string ConsumerSecret;
    string EncodedConsumerKey;
    string EncodedConsumerSecret;
    string BearerTokenCredentials;
    string Base64EncodedBearerToken;

    public TwitterHelper()
    {
        ConsumerKey = ConfigurationManager.AppSettings["TwitterConsumerKey"];
        ConsumerSecret = ConfigurationManager.AppSettings["TwitterConsumerSecret"];

        EncodedConsumerKey = HttpUtility.UrlEncode(ConsumerKey);
        EncodedConsumerSecret = HttpUtility.UrlEncode(ConsumerSecret);
        BearerTokenCredentials = string.Format("{0}:{1}", EncodedConsumerKey, EncodedConsumerSecret);
        Base64EncodedBearerToken = Convert.ToBase64String(Encoding.UTF8.GetBytes(BearerTokenCredentials));
    }

    public string RequestToken()
    {
        WebRequest request = WebRequest.Create("https://api.twitter.com/oauth2/token");
        string consumerKey = ConsumerKey;
        string consumerSecret = ConsumerSecret;
        string consumerKeyAndSecret = String.Format("{0}:{1}", consumerKey, consumerSecret);
        request.Method = "POST";
        request.Headers.Add("Authorization", String.Format("Basic {0}", Convert.ToBase64String(Encoding.UTF8.GetBytes(consumerKeyAndSecret))));
        request.ContentType = "application/x-www-form-urlencoded;charset=UTF-8";
        string postData = "grant_type=client_credentials";
        byte[] byteArray = Encoding.UTF8.GetBytes(postData);
        request.ContentLength = byteArray.Length;
        Stream dataStream = request.GetRequestStream();
        dataStream.Write(byteArray, 0, byteArray.Length);
        WebResponse response = request.GetResponse();
        using (StreamReader sr = new StreamReader(response.GetResponseStream()))
        {
            token = JsonConvert.DeserializeObject<TwitterBearerToken>(sr.ReadToEnd().Trim());
        }
        return token.access_token;
    }

    public RootObject SearchTweets()
    {
        if (token == null)
        {
            RequestToken();
        }

        // the original search URL (including the query) was stripped from
        // the post; the q parameter below is a placeholder
        var url = "https://api.twitter.com/1.1/search/tweets.json?q=placeholder";
        WebRequest request = WebRequest.Create(url);
        request.Headers.Add("Authorization", String.Format("Bearer {0}", token.access_token));
        WebResponse response = request.GetResponse();
        using (StreamReader sr = new StreamReader(response.GetResponseStream()))
        {
            return JsonConvert.DeserializeObject<RootObject>(sr.ReadToEnd());
        }
    }

    // TwitterBearerToken and RootObject are classes generated from the JSON responses
    private TwitterBearerToken _token;
    public TwitterBearerToken token
    {
        get { return _token; }
        set { _token = value; }
    }
}
I still need to do a load of things. Ideally I'll cache the token (I think the recommended time is 15 minutes).

There is currently no error checking or logging. In particular, I'll want to check for an invalid bearer token, so this needs to be added.

Note that the bearer token used here is the simplest way to access the API, and only allows read-only access.

Finally, I used an excellent online tool to automatically generate the C# classes from the JSON responses.

Not sure if this is the best way of doing this, but it will no doubt evolve as I include it in production code.
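A minimal usage sketch (assuming valid consumer keys in config; the `statuses` and `text` property names come from the generated RootObject classes, so they are assumptions here):

```csharp
var twitter = new TwitterHelper();
twitter.RequestToken();               // fetches and stores the bearer token
var results = twitter.SearchTweets();

// "statuses" and "text" are the usual field names in the search response,
// as produced by the generated RootObject classes.
foreach (var status in results.statuses)
{
    Console.WriteLine(status.text);
}
```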

TeamCity build

I've been using this build script with a NAnt runner within TeamCity recently. I can add %system.config% parameters to parts of the build process, including steps. This enables me to create configurations within VS and add web.config transforms, check them into a git repository of the same name, and then do a custom deployment based on that.
Within TeamCity, I call this file in two separate steps. The first is the build, the second is the deployment. The deployment target is "ftp_%system.config%". This enables me to add additional steps for unit testing, integration testing, code checking, etc.
<?xml version="1.0" encoding="utf-8"?>
<project default="Build" xmlns="">
  <target name="Build" description="Compiles/Builds the Solution">
    <echo message="Building..." />
    <msbuild project="Web\Web.csproj" failonerror="true" verbose="false">
      <arg value="/p:Configuration=${config};OutputPath=${config}" />
      <arg value="/p:UseWPP_CopyWebApplication=True" />
      <arg value="/p:PipelineDependsOnBuild=False" />
      <arg value="/p:WebProjectOutputDir=..\Output\${config}\" />
      <arg value="/t:Rebuild" />
      <arg value="/nologo" />
    </msbuild>
    <echo message="Building finished..." />
  </target>
  <target name="ftp_release">
    <echo message="Deploying website" level="Debug" />
    <ftpUpload host="" username="" password="" todir="site/wwwroot">
      <fileset basedir="Output\${config}\">
        <include name="**" />
      </fileset>
    </ftpUpload>
    <echo message="Website deployed!" level="Debug" />
  </target>
  <target name="ftp_master">
    <echo message="Deploying website" level="Debug" />
    <ftpUpload host="" username="" password="" todir="site/wwwroot">
      <fileset basedir="Output\${config}\">
        <include name="**" />
      </fileset>
    </ftpUpload>
    <echo message="Website deployed!" level="Debug" />
  </target>
</project>

Using SQL Server Express in an Azure VM

Setting up SQL Server in an Azure VM is easy. Connecting to it is not as obvious as it used to be.

The basic syntax is simple:



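As a sketch of connecting from C# (the server name, database, and credentials below are all placeholders; 1433 is the static port configured in the steps that follow):

```csharp
// Sketch only – "myvm.cloudapp.net", "MyDatabase" and the credentials are
// placeholders for the VM's public DNS name and your own SQL login.
using System.Data.SqlClient;

class ConnectDemo
{
    static void Main()
    {
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = "myvm.cloudapp.net,1433",  // host,port
            InitialCatalog = "MyDatabase",
            UserID = "myuser",
            Password = "mypassword"
        };

        using (var conn = new SqlConnection(builder.ConnectionString))
        {
            conn.Open();  // throws if the port/firewall steps below are missing
        }
    }
}
```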
Getting SQL working within the VM needs a little work.

Within SQL Server Configuration Manager (PowerShell -> "start SQLServerManager10.msc"):

SQL Server Network Configuration -> Protocols for SQLEXPRESS -> make sure that TCP/IP is enabled.

Then double-click on TCP/IP and switch to the IP Addresses tab. For the entry with the VM's internal address:

Active: Yes
Enabled: No
IP Address: (the internal address, not the public IP address)
TCP Dynamic Ports: 0
TCP Port: {blank}

Then under IPAll:

TCP Dynamic Ports: {blank}
TCP Port: 1433

Thanks to all the people who did the hard work in the following posts for making this easy.




