
Backup SQL DB with PowerShell

Useful post:

http://msdn.microsoft.com/en-us/library/dn223322.aspx

Set-ExecutionPolicy RemoteSigned  # only needed if unable to import the SQL module
Import-Module sqlps

cd SQLSERVER:\SQL\COMPUTERNAME
$insts = Get-ChildItem
$insts.Databases | Where-Object { $_.Status -eq "Normal" } | Backup-SqlDatabase

N.B. if the script is run without the status filter, it will fail when it tries to back up any database that is offline.

Google Analytics API

1. Install the Google.Apis.Analytics.v3 client library from NuGet.

2. Log into the Google developer console:

console.developers.google.com/project

Set up a project and create a new "Service account".

Download the key and copy the service account's email address.

3. Log into Google Analytics.

Set up a new user, using the email address obtained in step 2 above.

4. The basic code will be:

string AuthenticationKey = string.Empty;

var serviceAccountEmail = "xxx@developer.gserviceaccount.com";

// Load the key downloaded from the developer console
// (needs System.Security.Cryptography.X509Certificates;
// "key.p12" is a placeholder path and "notasecret" is Google's default key password).
var certificate = new X509Certificate2(@"key.p12", "notasecret", X509KeyStorageFlags.Exportable);

var credential = new ServiceAccountCredential(
    new ServiceAccountCredential.Initializer(serviceAccountEmail)
    {
        Scopes = new[] { AnalyticsService.Scope.AnalyticsReadonly }
    }.FromCertificate(certificate));

if (credential.RequestAccessTokenAsync(CancellationToken.None).Result)
{
    AuthenticationKey = credential.Token.AccessToken;
}

var gas = new AnalyticsService(new BaseClientService.Initializer()
{
    HttpClientInitializer = credential,
    ApplicationName = "random variable Insights",
});

var li = gas.Management.Accounts.List().Execute();

Account ac = li.Items.Where(a => a.Name.Contains("google analytics name")).First();
ManagementResource.WebpropertiesResource.ListRequest WebPropertyListRequest = gas.Management.Webproperties.List(ac.Id);
Webproperties WebPropertyList = WebPropertyListRequest.Execute();

var profId = WebPropertyList.Items.Where(et => et.Name == "google analytics name 2").Select(a => a.DefaultProfileId);

var profStr = string.Format("ga:{0}", profId.First().ToString());
var r = gas.Data.Ga.Get(profStr, "2013-01-01", "2014-01-21", "ga:pageviews");

r.Dimensions = "ga:pagePath";
r.MaxResults = 5;

var results = r.Execute();
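
The GaData object that comes back exposes the rows as lists of strings, one column per dimension/metric requested. A minimal sketch of reading them back out (column order assumed to match the request above: pagePath, then pageviews):

// results.Rows is null when the query returns no data, so guard against that.
if (results.Rows != null)
{
    foreach (var row in results.Rows)
    {
        // row[0] = ga:pagePath, row[1] = ga:pageviews (matching the request above)
        Console.WriteLine("{0}: {1} page views", row[0], row[1]);
    }
}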

 

5. BAM!

6. (You may need to run as Network Service.)

Resources for learning about building websites

A few useful links related to learning to build websites. 

General

http://www.w3schools.com/

HTML

http://html5doctor.com/

http://www.htmlandcssbook.com/

CSS

http://learnlayout.com/

JavaScript

http://www.codecademy.com/

http://www.codeavengers.com/

https://www.udacity.com/

Angular

https://egghead.io/

Twitter OAuth

As advised on the Twitter calendar, API v1.0 stopped working on June 11, 2013.

This is a very rough-and-ready Twitter class to get things working again:

using System;
using System.Web;
using System.Configuration;
using System.Net;
using System.IO;
using System.Text;
using Newtonsoft.Json;
public class TwitterHelper
{
    string ConsumerKey;
    string ConsumerSecret;
    string EncodedConsumerKey;
    string EncodedConsumerSecret;
    string BearerTokenCredentials;
    string Base64EncodedBearerToken;

    public TwitterHelper()
    {
        ConsumerKey = ConfigurationManager.AppSettings["TwitterConsumerKey"];
        ConsumerSecret = ConfigurationManager.AppSettings["TwitterConsumerSecret"];

        EncodedConsumerKey = HttpUtility.UrlEncode(ConsumerKey);
        EncodedConsumerSecret = HttpUtility.UrlEncode(ConsumerSecret);
        BearerTokenCredentials = string.Format("{0}:{1}", EncodedConsumerKey, EncodedConsumerSecret);
        Base64EncodedBearerToken = Convert.ToBase64String(Encoding.UTF8.GetBytes(BearerTokenCredentials));
    }

    // Exchanges the consumer key/secret for an application-only bearer token.
    public string RequestToken()
    {
        WebRequest request = WebRequest.Create("https://api.twitter.com/oauth2/token");
        string consumerKey = ConsumerKey;
        string consumerSecret = ConsumerSecret;
        string consumerKeyAndSecret = String.Format("{0}:{1}", consumerKey, consumerSecret);
        request.Method = "POST";
        request.Headers.Add("Authorization", String.Format("Basic {0}", Convert.ToBase64String(Encoding.UTF8.GetBytes(consumerKeyAndSecret))));
        request.ContentType = "application/x-www-form-urlencoded;charset=UTF-8";
        string postData = "grant_type=client_credentials";
        byte[] byteArray = Encoding.UTF8.GetBytes(postData);
        request.ContentLength = byteArray.Length;
        Stream dataStream = request.GetRequestStream();
        dataStream.Write(byteArray, 0, byteArray.Length);
        dataStream.Close();
        WebResponse response = request.GetResponse();
        using (StreamReader sr = new StreamReader(response.GetResponseStream()))
        {
            token = JsonConvert.DeserializeObject<TwitterBearerToken>(sr.ReadToEnd().Trim());
            return token.access_token;
        }
    }

    // Runs a search against the 1.1 API using the bearer token (application-only, read-only).
    public RootObject SearchTweets()
    {
        if (token == null)
            RequestToken();

        var url = "https://api.twitter.com/1.1/search/tweets.json?q=%23searchperson";
        WebRequest request = WebRequest.Create(url);
        request.Headers.Add("Authorization", String.Format("Bearer {0}", token.access_token));
        WebResponse response = request.GetResponse();
        using (StreamReader sr = new StreamReader(response.GetResponseStream()))
        {
            return JsonConvert.DeserializeObject<RootObject>(sr.ReadToEnd());
        }
    }

    private TwitterBearerToken _token;
    public TwitterBearerToken token
    {
        get { return _token; }
        set { _token = value; }
    }
}
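
The TwitterBearerToken class used above is just the shape of the JSON returned by the oauth2/token endpoint; a minimal version would look something like this (RootObject is the class generated from the search response, as mentioned below):

// Shape of the JSON returned by https://api.twitter.com/oauth2/token
public class TwitterBearerToken
{
    public string token_type { get; set; }
    public string access_token { get; set; }
}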

There is still a load of things I need to do. Ideally I'll cache the token (I think the recommended time is 15 minutes).
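
A minimal sketch of that caching, as a method that could sit inside TwitterHelper (it assumes a reference to System.Runtime.Caching, and the 15-minute lifetime is just the figure mentioned above, not an official limit):

// Requires "using System.Runtime.Caching;"
private TwitterBearerToken GetCachedToken()
{
    var cache = MemoryCache.Default;
    var cached = cache.Get("TwitterBearerToken") as TwitterBearerToken;
    if (cached != null)
        return cached;

    RequestToken(); // populates the token property
    cache.Set("TwitterBearerToken", token, DateTimeOffset.Now.AddMinutes(15));
    return token;
}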

There is currently no error checking or logging. In particular, I'll want to check for an invalid bearer token, so this needs to be added.
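
One possible shape for that check, assuming the 1.1 endpoints reject a bad bearer token with HTTP 401 (a sketch only, not tested against the live API):

// Retry the search once if the bearer token has been rejected.
public RootObject SearchTweetsWithRetry()
{
    try
    {
        return SearchTweets();
    }
    catch (WebException ex)
    {
        var response = ex.Response as HttpWebResponse;
        if (response != null && response.StatusCode == HttpStatusCode.Unauthorized)
        {
            RequestToken();        // token invalid or expired: request a new one
            return SearchTweets();
        }
        throw;
    }
}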

Note that the bearer token used here is the simplest way to access the API, and it only gives read-only access.

Finally, I use the excellent tool at http://json2csharp.com/ to automatically generate C# classes from the JSON.

I'm not sure this is the best way of doing it, but it will no doubt evolve as I include it in production code.

TeamCity build

I've been using this build script with a NAnt runner within TeamCity recently. I can pass %system.config% parameters into parts of the build process, including individual steps. This lets me create configurations within Visual Studio and add web.config transforms, check these into a Git repository of the same name, and then do a custom deployment based on that.
Within TeamCity, I call this file in two separate steps: the first is the build, the second is the deployment. The deployment target is "ftp_%system.config%". This lets me add additional steps for unit testing, integration testing, code checking etc.
<?xml version="1.0" encoding="utf-8"?>
<project default="Build" xmlns="http://nant.sf.net/release/0.90/nant.xsd">
  <target name="Build" description="Compiles/Builds the Solution">
    <echo message="Building..." />
    <msbuild project="Web\Web.csproj" failonerror="true" verbose="false">
      <arg value="/p:Configuration=${config};OutputPath=${config}" />
      <arg value="/p:UseWPP_CopyWebApplication=True" />
      <arg value="/p:PipelineDependsOnBuild=False" />
      <arg value="/p:WebProjectOutputDir=..\Output\${config}\" />
      <arg value="/t:Rebuild" />
      <arg value="/nologo" />
    </msbuild>
    <echo message="Building finished..." />
  </target>
  <target name="ftp_release">
    <echo message="Deploying website" level="Debug" />
    <ftpUpload host="waws-prod-am2-001.ftp.azurewebsites.windows.net" username="" password="" todir="site/wwwroot">
      <fileset basedir="Output\${config}\">
        <include name="**" />
      </fileset>
    </ftpUpload>
    <echo message="Website deployed!" level="Debug" />
  </target>
  <target name="ftp_master">
    <echo message="Deploying website" level="Debug" />
    <ftpUpload host="waws-prod-am2-001.ftp.azurewebsites.windows.net" username="" password="" todir="site/wwwroot">
      <fileset basedir="Output\${config}\">
        <include name="**" />
      </fileset>
    </ftpUpload>
    <echo message="Website deployed!" level="Debug" />
  </target>
  
</project>

Using SQL Server Express in an Azure VM

Setting up SQL Server in an Azure VM is easy. Connecting to it is not as obvious as it used to be.

The basic syntax is simple:

servername\instancename,port

e.g. company-azure.cloudapp.net\SQLEXPRESS,1433
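
As an illustration, connecting with that syntax from C# might look like the sketch below (the database name, user and password are placeholders, as is the server name reused from the example above):

using System;
using System.Data.SqlClient;

class ConnectTest
{
    static void Main()
    {
        // "company-azure.cloudapp.net\SQLEXPRESS,1433" follows the servername\instancename,port pattern above.
        var connectionString =
            @"Server=company-azure.cloudapp.net\SQLEXPRESS,1433;Database=MyDatabase;User Id=myUser;Password=myPassword;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            Console.WriteLine("Connected to: " + connection.DataSource);
        }
    }
}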

Getting SQL working within the VM needs a little work.

Within SQL Server Configuration Manager (PowerShell -> "start SQLServerManager10"):

SQL Server Network Configuration -> Protocols for SQLEXPRESS -> make sure that TCP/IP is enabled.

Then double-click on TCP/IP and switch to the IP Addresses tab:

IP2
Active: Yes
Enabled: No
IP Address: 10.xxx.xxx.xxx (the internal address, not the public IP)
TCP Dynamic Ports: 0
TCP Port: {blank}

IPAll
TCP Dynamic Ports: {blank}
TCP Port: 1433

Thanks to all the people who did the hard work in the following posts for making this easy.

http://www.windowsazure.com/en-us/manage/windows/common-tasks/install-sql-server/#TCP

http://stackoverflow.com/questions/11278114/enable-remote-connections-for-sql-server-express-2012

http://msdn.microsoft.com/en-us/library/ms177440.aspx

http://rediantosatya.wordpress.com/2012/01/12/how-to-connect-to-a-non-standard-tcpip-port-from-sql-server-management-studio/

Windows 2012 – Azure

A couple of quick things to note here:

From the desktop, press Ctrl + Esc (or the Windows key) to bring up the "Metro" interface.

To get to IE, type a web address in Explorer, or open the Metro interface and click on the IE icon.

If you want to download something, you need to change the "IE Enhanced Security Configuration" settings on the dashboard.

To open the Windows firewall, open Server Manager, then look at Tools -> Windows Firewall with Advanced Security. Or in PowerShell type "start firewall".

To get to the environment variables: open Explorer -> right-click on "Computer" and select Properties -> select Advanced system settings. Though to just list them, you can open PowerShell and type "gci env:".

To start IIS Manager, in PowerShell type "start inetmgr".
