How to tell if your Azure SQL Stored Procedure is Running

December 8th, 2014

One of the capabilities of the Cotega service is the ability to schedule execution of stored procedures in Azure SQL.  This is handy for many administrators because Azure SQL does not have a SQL Agent capability.  One interesting side effect I have seen is that long-running stored procedures can cause issues when they are scheduled to run frequently.  If the previous execution of the stored procedure has not completed before the next one starts, you can run into issues such as locking, or in extreme cases the executions pile up and eventually cause connection drops because the maximum request count is reached.

One solution that I have come up with is to add code to the start of a stored procedure that detects whether it is already running.  If it is, the procedure simply returns.  It was interesting to see that there was not a lot of information on the internet about how to detect whether an Azure SQL stored procedure is running, so I decided to build a function to do it.  This is what I came up with; it returns a count of the number of times a specific stored procedure is currently running:

CREATE FUNCTION [dbo].[CheckStoredProcActive] (@StoredProc nvarchar(255))
RETURNS int
as
begin
declare @spCount int;
set @spCount = (select count(*) FROM sys.dm_exec_sessions s
INNER JOIN sys.dm_exec_requests r
ON r.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS SQL
INNER JOIN sys.objects o
ON SQL.objectid = o.object_id
WHERE s.is_user_process = 1
AND r.database_id = db_id()
AND o.name = @StoredProc);
RETURN @spCount;
end;

Then you can simply add the following to the start of your procedure, replacing [ENTER_STORED_PROC_NAME] with your stored procedure's name:

if (dbo.CheckStoredProcActive('[ENTER_STORED_PROC_NAME]') > 0)
   return;
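For example, a scheduled procedure would guard itself like this (dbo.NightlyCleanup is just a placeholder name for illustration):

CREATE PROCEDURE [dbo].[NightlyCleanup]
AS
BEGIN
    -- Exit immediately if a previous scheduled run is still active.
    -- If the current execution itself shows up in the count, compare against 1 instead.
    if (dbo.CheckStoredProcActive('NightlyCleanup') > 0)
        return;

    -- ... the real work of the procedure goes here ...
END;

Keeping the guard at the very top means a skipped run costs almost nothing.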

I hope others find this helpful.

Liam

How to tell if your Azure SQL database performance issues are data center related

February 20th, 2014

I am the founder of a startup called Cotega and also a Microsoft employee where I work as a Program Manager. This is a series of posts where I talk about my experience building a startup outside of Microsoft. I do my best to take my Microsoft hat off and tell both the good parts and the bad parts I experienced using Azure.

Working on a monitoring service for Azure SQL Database allows me to get a really good idea of the common issues DBAs have when working with their databases.  If I had to choose the most common frustration I hear from DBAs, it is figuring out whether a performance issue is limited to their own database or affects the entire data center where the database is hosted.  It has been a goal of mine from the very early days of Cotega to help solve this problem.  Today I am happy to announce that within Cotega, you are now able to see your performance data overlaid with the data center's performance.  Below is an example of connection latency measured from outside the Azure data centers.  It shows the connection latency of my database (in blue) along with the average latency of all other customers in this data center (in orange).

Azure SQL Database Data Center Performance


This has taken me some time to build because, in order to get a true picture of the data center you are in, I needed enough customers in each data center that I could publish an aggregated average of all the data gathered from monitoring these databases without exposing any information about any one customer's performance data.
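For those curious, the latency measurement itself is nothing exotic.  A minimal C# sketch (this is not the actual Cotega code, and the connection string is a placeholder) simply times how long it takes to open a connection:

using System;
using System.Data.SqlClient;
using System.Diagnostics;

class LatencyProbe
{
    static void Main()
    {
        // Placeholder connection string - point this at your own Azure SQL database.
        const string connectionString =
            "Server=tcp:yourserver.database.windows.net;Database=yourdb;User ID=user;Password=pass;Encrypt=True;";

        var timer = Stopwatch.StartNew();
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();   // Time only the connection handshake, not a query.
        }
        timer.Stop();
        Console.WriteLine("Connection latency: {0} ms", timer.ElapsedMilliseconds);
    }
}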

As of right now, I have a really good picture of all the major Azure data centers (with the exception of East Asia).  In fact, even without a Cotega account, you can see the status of each of the data centers here.  Please note that this data is related only to that of SQL Database and does not reflect that of other Azure services.

I hope that all of you will consider creating an account to help everyone build an even more complete picture of the health of the SQL Azure data centers.  To learn more about how to create an account, please visit www.cotega.com.

Liam

How to Perform Performance & Diagnostics Analysis for SQL Azure

June 18th, 2013

I am the founder of a startup called Cotega and also a Microsoft employee where I work as a Program Manager. This is a series of posts where I talk about my experience building a startup outside of Microsoft. I do my best to take my Microsoft hat off and tell both the good parts and the bad parts I experienced using Azure.

A fully managed service such as SQL Azure has a number of advantages, but there are also a few disadvantages. One of these is in diagnosing performance issues. This can be tricky since you do not have access to the machine like you would with an on-premises SQL Server. To date, I have limited Cotega to monitoring current database issues and notifying users when issues occur.  More recently, however, I have added diagnostics capabilities to allow DBAs to drill down and really understand what is causing an issue.  To do this, I use many of the queries outlined below, many of which were gathered from the expertise of the SQL Server and SQL Azure MVPs.  If you have others that you like to use, I would love to hear from you. For more information on how Cotega does this, please see the following page on Cotega Performance & Diagnostics Analysis.

Performance and Diagnostics Queries for SQL Azure Databases

Top 10 Most CPU intensive queries

SELECT TOP 10 query_stats.query_hash AS "QueryHash",
SUM(query_stats.total_worker_time) / SUM(query_stats.execution_count) AS "AvgCPUTime",
MIN(query_stats.statement_text) AS "StatementText"
FROM
(SELECT QS.*,
SUBSTRING(ST.text, (QS.statement_start_offset/2) + 1,
((CASE statement_end_offset
WHEN -1 THEN DATALENGTH(ST.text)
ELSE QS.statement_end_offset END
- QS.statement_start_offset)/2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS QS
CROSS APPLY sys.dm_exec_sql_text(QS.sql_handle) AS ST) AS query_stats
GROUP BY query_stats.query_hash
ORDER BY 2 DESC

Top 10 Most Resource Intensive Queries
SELECT highest_cpu_queries.plan_handle, highest_cpu_queries.total_worker_time, q.dbid,
q.objectid, q.number, q.encrypted, q.[text]
FROM
(SELECT TOP 10 qs.plan_handle, qs.total_worker_time
FROM sys.dm_exec_query_stats qs
ORDER BY qs.total_worker_time desc) AS highest_cpu_queries
CROSS APPLY sys.dm_exec_sql_text(plan_handle) AS q
ORDER BY highest_cpu_queries.total_worker_time DESC

Current Connection Info
SELECT e.connection_id, s.session_id, s.login_name, s.last_request_end_time, s.cpu_time
FROM sys.dm_exec_sessions s INNER JOIN sys.dm_exec_connections e ON s.session_id = e.session_id

Top 10 Running Queries with the Longest Total Elapsed Time that are Blocking Other Queries
SELECT TOP 10 r.session_id, r.plan_handle, r.sql_handle, r.request_id, r.start_time, r.status,
r.command, r.database_id, r.user_id, r.wait_type, r.wait_time, r.last_wait_type,
r.wait_resource, r.total_elapsed_time, r.cpu_time, r.transaction_isolation_level, r.row_count, st.text
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) as st
WHERE r.blocking_session_id = 0
and r.session_id in
(SELECT distinct(blocking_session_id) FROM sys.dm_exec_requests)
GROUP BY r.session_id, r.plan_handle, r.sql_handle, r.request_id, r.start_time, r.status,
r.command, r.database_id, r.user_id, r.wait_type, r.wait_time, r.last_wait_type,
r.wait_resource, r.total_elapsed_time, r.cpu_time, r.transaction_isolation_level, r.row_count, st.text
ORDER BY r.total_elapsed_time desc

Row Count Aggregate Information (total rows, min rows, max rows and last rows) for Queries
SELECT qs.execution_count,
SUBSTRING(qt.text,qs.statement_start_offset/2 +1,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(nvarchar(max), qt.text)) * 2
ELSE qs.statement_end_offset END -
qs.statement_start_offset
)/2
) AS query_text,
qt.dbid, dbname= DB_NAME (qt.dbid), qt.objectid,
qs.total_rows, qs.last_rows, qs.min_rows, qs.max_rows
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
WHERE qt.text LIKE '%SELECT%'
ORDER BY qs.execution_count DESC;

Top 10 Most Expensive Queries by Logical Reads
SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_logical_reads DESC

Top 10 Most Expensive Queries by Logical Writes
SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_logical_writes DESC

Top 10 Most Expensive Queries by Worker Time
SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_worker_time DESC

Queries Taking Longest Elapsed Time
SELECT TOP 10
qs.total_elapsed_time / qs.execution_count / 1000000.0 AS average_seconds,
qs.total_elapsed_time / 1000000.0 AS total_seconds,
qs.execution_count,
SUBSTRING (qt.text,qs.statement_start_offset/2,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
ELSE
qs.statement_end_offset
END
- qs.statement_start_offset)/2) AS individual_query,
o.name AS object_name,
DB_NAME(qt.dbid) AS database_name
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) as qt
LEFT OUTER JOIN sys.objects o ON qt.objectid = o.object_id
where qt.dbid = DB_ID()
ORDER BY average_seconds DESC;

Queries doing the Most I/O
SELECT TOP 10
(total_logical_reads + total_logical_writes) / qs.execution_count AS average_IO,
(total_logical_reads + total_logical_writes) AS total_IO,
qs.execution_count AS execution_count,
SUBSTRING (qt.text,qs.statement_start_offset/2,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
ELSE qs.statement_end_offset END - qs.statement_start_offset)/2) AS individual_query,
o.name AS object_name,
DB_NAME(qt.dbid) AS database_name
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) as qt
LEFT OUTER JOIN sys.objects o ON qt.objectid = o.object_id
where qt.dbid = DB_ID()
ORDER BY average_IO DESC;

Calculate the Database Size (MB)
SELECT SUM(reserved_page_count) * 8192 / 1024 / 1024 as DatabaseSize FROM sys.dm_db_partition_stats
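If you want to run any of these queries programmatically, the way a monitoring service would, a minimal C# sketch looks like this (the connection string is a placeholder, and I am using the database-size query as the example):

using System;
using System.Data.SqlClient;

class DiagnosticsRunner
{
    static void Main()
    {
        // Placeholder connection string - replace with your own database details.
        const string connectionString =
            "Server=tcp:yourserver.database.windows.net;Database=yourdb;User ID=user;Password=pass;Encrypt=True;";
        const string query =
            "SELECT SUM(reserved_page_count) * 8192 / 1024 / 1024 AS DatabaseSize FROM sys.dm_db_partition_stats";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(query, conn))
        {
            conn.Open();
            Console.WriteLine("Database size: {0} MB", cmd.ExecuteScalar());
        }
    }
}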


Check out the new look of Cotega – SQL Azure Monitoring Service

February 9th, 2013

I am the founder of a startup called Cotega and also a Microsoft employee where I work as a Program Manager. This is a series of posts where I talk about my experience building a startup outside of Microsoft. I do my best to take my Microsoft hat off and tell both the good parts and the bad parts I experienced using Azure.

It has been some time since I have posted an update, but the past few months have been quite busy. One of the things I have been working on is a major update to the Cotega service. A lot of the feedback I heard on the v1 version of Cotega was in the area of ease of use. There are a lot of things that people want to monitor for their database, and I had just made it too hard to add everything that was needed. So I decided to step back and come up with a new way to set up your database monitoring. Now, you simply point Cotega at your database, provide an email address for notifications, and you are ready to go.

In addition, there is now a "Free" subscription that does not require a credit card and allows you to try out the service before making a commitment. The key difference between the Free plan and a paid one is that monitoring is only done once every 2 hours (as opposed to every 10 minutes for a paid account). In addition, paid accounts will send email notifications when database issues occur.

Cotega Dashboard

How to Host a Website on Azure and Link it to a Domain like .IO

October 6th, 2012

.COM addresses are becoming harder and harder to come by.  It seems each time you look, it is harder to get the name you want.  Many startups have become creative in the way they name their site; for example, if you wanted to create a site called score.com, you might try calling it scorable.com or scorability.com.  Another option that seems to be getting more and more popular is to use non-.COM domains such as instant.ly or ordr.in or ginger.io.  Many of these domains are easier to get, but the downside is that they are typically more expensive and the process of configuring them is often harder.  I want to talk about how I managed to create a URL using the .IO domain for a site hosted on Windows Azure (where all the HTML pages are hosted).  I am going to assume you already know how to create a web site on Windows Azure.

Getting the Domain Name

In the past, I have used GoDaddy as the place where I set up most of my domains.  For .COMs this is incredibly cheap (most of the time you can find a promo code for just about anything) and I have found them to be extremely supportive and easy to work with.  However, in this case I wanted to get a .IO domain (which is actually an Indian Ocean domain) because the .COM was owned by someone else trying to sell it.  To do this, I used http://nic.io, where it cost ~$93 for the year (as opposed to ~$8 for a .COM domain).

DNS Server Configuration

After I purchased the .io domain from nic.io, I learned that they needed to be provided with DNS settings requiring a primary and a secondary DNS server.  This was kind of frustrating, since GoDaddy had always provided this for me whenever I created a .COM address.  After looking a little more, I learned that I could just use GoDaddy for this as well.  In my case, I could log into my existing GoDaddy account and configure this at no extra charge.  You could either look at another DNS server provider or create an account with GoDaddy to do this.

GoDaddy DNS Configuration

Within my GoDaddy account, I launched the DNS Manager and chose to create a new Offsite Domain.  This option was pretty hidden, which is why I am adding a screenshot of where I went.

GoDaddy DNS Configuration

In the Domain Name text box I entered my .IO domain name, clicked Next, and copied the 2 DNS servers that were provided.  Keep these handy, as you will need them in a minute.

Create A Record

The next step is to configure the GoDaddy DNS servers so that when a request comes in for my .IO domain, they know where to direct the user.  In my case that is a Windows Azure website with a .cloudapp.net name.  To do this, in the GoDaddy DNS dashboard, click "Edit Zone" under the domain you just added.  In the A (Host) section, choose "Add Record"; for "Host" use @, and for "Points To" add the IP address of your Azure-hosted site.

Choose Save

Configure A Record
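In zone-file notation, the record you just created amounts to something like this (203.0.113.10 is a documentation placeholder; use the actual IP address of your Azure-hosted site):

@    IN    A    203.0.113.10

The @ stands for the bare domain itself.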

Add DNS Servers to NIC.io.

The final step is to configure NIC.io so that it knows to use the GoDaddy domain servers you just configured.  Log in to the admin panel for NIC.io and choose to Manage the DNS Settings.  In the “DNS Servers or Mail & Web Forwarding Details”, for the primary server enter the first DNS server you copied from the previous steps and in the Secondary Server enter the second.    If you only have one DNS server, I think that is probably ok.  At the bottom choose “Modify Domain”.

Give it an hour or two

It usually takes an hour or two for these configurations to fully get updated.  However, after that you should be all set to go.
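If you would rather check on the progress than just wait, you can query DNS directly from a command prompt (replace yourdomain.io with your own name):

nslookup yourdomain.io
nslookup -type=NS yourdomain.io

The first command should eventually return the IP address of your Azure site, and the second should list the GoDaddy name servers you entered at NIC.io.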

Using Cloud Services to send Notifications with Windows Phone

July 28th, 2012

I am pretty excited because this week one of my other personal side projects, the "Windows Phone Baby Monitor", just won the "2012 Appies People's Choice Award", an internal contest for Microsoft colleagues.  What I would like to do in this post is give you some background on how I managed to implement SMS, phone, and email notifications from Windows Phone for this app.
Baby Monitor for Windows Phone

Background

This application basically turns your Windows Phone into a baby monitor.  You leave the phone near your baby while it sleeps, and when the app detects crying it sends a notification by phone, email, or SMS.  There are a number of things I do within the application, such as tuning the decibel threshold and how long the crying must last, to avoid false notifications from things like doors slamming.

Sending Notifications with Windows Phone APIs

When I first started building the application, I figured this would be very simple to implement because I could just write to some sort of Windows Phone API that would allow me to send the SMS, email or phone notifications.  Although there is an API for doing this, the main issue I had was the fact that the user needed to physically click an approve button for the notification to complete.  Since I figured it was somewhat unrealistic to expect a baby to sit up, move to the phone and click a button when it was done sleeping, I needed to come up with another solution.

Using Services to Send Notifications

From my work with Cotega (which is a monitoring service for SQL Azure databases), I had a little background experience in how to send notifications.  For that service, I used Amazon's SES to send email notifications.  For this Baby Monitor, I chose to use a different service called SendGrid to send email notifications.  I chose this service over Amazon SES because I wanted to compare the two, and also because there was a great offer of 25,000 free emails / month for Azure services.  Since SendGrid does not support sending SMS or making phone calls, for that part I chose to use Twilio.

SendGrid

SendGrid was incredibly easy to implement within my MVC service.  The way it works with the Baby Monitor is that when the phone detects crying, it makes a WebClient request to my MVC service, passing some details like the Windows Phone device id, decibel level and the email address that needs to be contacted.  From that point the service makes a SendGrid request to send the email.  As you can see, the controller code is pretty simple and looks like this.

using System.Net;
using System.Net.Mail;
using SendGridMail;
using SendGridMail.Transport;
// Create the email object first, then add the properties.
SendGrid myMessage = SendGrid.GenerateInstance();
myMessage.AddTo(emailAddress);
// Replace [your from address] with the address the alert should come from.
myMessage.From = new MailAddress("[your from address]", "Cotega Baby Monitor");
myMessage.Subject = "Baby Monitor Alert!";
myMessage.Text = "Crying was detected by the Baby Monitor.  Decibel Level: " + maxDecibels + ".  ";

// Create credentials, specifying your user name and password.
var credentials = new NetworkCredential("[myusername]", "[MyPassword]");

// Create an REST transport for sending email.
var transportREST = REST.GetInstance(credentials);

// Send the email.
transportREST.Deliver(myMessage);
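For context, here is a hypothetical sketch of the phone-side request that triggers this controller (the service URL and parameter names are placeholders, not the actual Cotega endpoint; deviceId, maxDecibels, and emailAddress are assumed to be set elsewhere in the app):

// Windows Phone side: notify the MVC service when crying is detected.
var client = new WebClient();
string url = string.Format(
    "http://yourservice.example.com/monitor/alert?deviceId={0}&maxDecibels={1}&email={2}",
    deviceId, maxDecibels, Uri.EscapeDataString(emailAddress));
client.DownloadStringCompleted += (s, e) => { /* inspect e.Error or e.Result if needed */ };
client.DownloadStringAsync(new Uri(url));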

Twilio for Sending SMS and Phone Notifications

This part was a little more complex because although sending notifications through Twilio is pretty cheap within North America, I still needed a mechanism to limit the ability for people to send unlimited notifications.  For this, I chose to implement a token based system that was linked to the device id of the phone.  When they download the app, I give them a certain number of tokens to send SMS and make phone calls (emails are free), and if they wish to purchase more they can do so and tokens are applied to their account.  There are some really cool things about Twilio such as:

  • Text to Voice for phone calls:  This allows me to send some text to the Twilio service such as "Your baby is crying", and when Twilio makes a call to the person's phone, the person hears a computer-like voice stating "Your baby is crying".
  • Attaching audio files to phone calls:  When I call the Twilio service, I can pass it a link to a URL that contains an audio file.  When Twilio makes the call to the person, the audio file can be played over the phone.  This is really useful because I can attach a snippet of audio of their baby crying to allow them to ensure that it is really their baby crying and not a jack hammer in the background or something else.

Just like with SendGrid, the code used in the MVC controller is really simple using the TwiML API.  Here is what it looks like for sending SMS messages.

using Twilio;
using Twilio.TwiML;
//twilioAccountSID and twilioAuthToken values are stored in the web.config (hopefully encrypted)
public string twilioAccountSID = ConfigurationManager.AppSettings["twilioAccountSID"];
public string twilioAuthToken = ConfigurationManager.AppSettings["twilioAuthToken"];
//This is the controller code that sends the SMS message
var twilio = new TwilioRestClient(twilioAccountSID, twilioAuthToken);
string smsMessageBody = "Baby Monitor Alert! Crying detected at decibel Level: " + maxDecibels + ". ";
if (fileName != null)
    smsMessageBody += "http://cotega.com/audio/" + fileName + ".wav";
// if phone number is 10 digits I need to add the +1
if (phoneNumber.Length == 10)
    phoneNumber = "+1" + phoneNumber;
var msg = twilio.SendSmsMessage("+1[mytwilionumber]", phoneNumber, smsMessageBody);

There is a little bit of code, which I have not shown, that runs before and after this: it first checks a SQL Azure database to see if the user has enough tokens to send a notification, and then updates their token count after the message is sent.
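If you are curious what that check might look like, here is a hypothetical sketch (the DeviceTokens table and its columns are names I made up for illustration, not the actual schema; connectionString is assumed to be defined elsewhere):

// Hypothetical token check - requires using System.Data.SqlClient;
bool HasTokens(string deviceId)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT TokenCount FROM DeviceTokens WHERE DeviceId = @deviceId", conn))
    {
        cmd.Parameters.AddWithValue("@deviceId", deviceId);
        conn.Open();
        object result = cmd.ExecuteScalar();
        return result != null && (int)result > 0;
    }
}

// After a successful send, decrement the device's token count.
void ConsumeToken(string deviceId)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "UPDATE DeviceTokens SET TokenCount = TokenCount - 1 WHERE DeviceId = @deviceId", conn))
    {
        cmd.Parameters.AddWithValue("@deviceId", deviceId);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}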

The code for making the phone call is very similar.

// Create an instance of the Twilio client.
TwilioRestClient client;
client = new TwilioRestClient(twilioAccountSID, twilioAuthToken);

// Use the Twilio-provided site for the TwiML response.
String Url = "http://twimlets.com/message";
Url = Url + "?Message%5B0%5D=" + "Hello.%20Crying%20was%20detected%20by%20the%20baby%20monitor.";
if (fileName != null)
    Url += "&Message%5B1%5D=" + "http://www.cotega.com/audio/"+fileName+".wav";

// Instantiate the call options that are passed to the outbound call
CallOptions options = new CallOptions();

// Set the call From, To, and URL values to use for the call.
// This sample uses the sandbox number provided by
// Twilio to make the call.
options.From = "+1[my twilio number]";
options.To = "+1" + phoneNumber;
options.Url = Url;

// Make the call.
Call call = client.InitiateOutboundCall(options);

Notice how I can attach a string of text along with a URL to http://twimlets.com/message. For example, try clicking this link and see the XML that is created.  If you passed this to Twilio, it would make a phone call and say "Hello from Liam".  Notice also how I attached a link to a WAV file.  Twilio will take that audio and play it when the person answers the phone.  Very cool, right?

As I mentioned, I did not go into many details of how I linked the Windows Phone device ID to the service to verify that the user has enough tokens before making the call, but if you are interested in this, let me know and I can help you get going.

Liam

How to use PayPal with ASP.NET MVC

June 12th, 2012

I talked previously about why I chose to use Stripe with Cotega to allow me to accept credit card payments. In that post I talked about how excellent their support was and how easy it was to take their samples and implement them in my service. I am still extremely pleased with my choice of Stripe and I have no intention of moving away from it. However, I have always wanted to investigate using PayPal as an alternative payment option, because occasionally international customers have credit cards that are not accepted by Stripe, or they just plain prefer to use PayPal. For this reason, I decided to take some time and understand what it would take to implement PayPal in my MVC-based service.
Unfortunately, I did not find this to be quite as simple to implement as Stripe. Although PayPal has a number of examples, none of the ones I could find were targeted at MVC, and there did not seem to be any good tutorials on the subject to get me started. I hope this blog post will help anyone thinking of implementing PayPal with MVC. I do want to caveat this with the fact that I am far from a PayPal expert, and I do not claim that this is the "PayPal preferred" way to implement it; I have simply tried to convert other examples as best I can, and it has worked well for me so far. If you have any comments or suggestions, I am very interested to hear your feedback.

Step 1 – Choosing PayPal IPN vs. PDT vs. Express

One of the first things I needed to decide on was the appropriate “PayPal Responder” to use. A responder is like a callback that PayPal does when the transaction is complete (successful, unsuccessful, invalid, etc.). It can also return custom variables that you set when the user first clicks the “Pay Now” button.
Since we are using MVC, we first need to create a View that initiates the transaction to PayPal. This is where the user clicks the “Pay Now” button and is re-directed to PayPal where they enter their credit card or PayPal account information. After this is complete, PayPal can initiate a responder which calls your service with the results of the transaction. The flow of the process looks something like this:

PayPal payment process

You really only need to use a responder in the cases where you need to continue with some process after that transaction completes. For example, in my case I want to be able to enable a user subscription once the transaction completes. If you are simply accepting payments, there may be no need to go to the complexity of using responders.
There is lots of information on responders, but I found that Instant Payment Notification (IPN) was the best choice for me, and it probably is for you as well if you simply want a callback from PayPal to your MVC controllers.

Step 2 – Implementing the View and Form

The next step is to create a payment page using a View where the user can choose to start the PayPal process.

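Here is a minimal sketch of the kind of HTML form I used (the business email address, URLs, and amount are placeholders; the field names are PayPal's standard button variables):

<form action="https://www.paypal.com/cgi-bin/webscr" method="post">
@* For testing, post to https://www.sandbox.paypal.com/cgi-bin/webscr instead *@
    <input type="hidden" name="cmd" value="_xclick" />
    <input type="hidden" name="business" value="payments@example.com" />
    <input type="hidden" name="item_name" value="Cotega Subscription" />
    <input type="hidden" name="amount" value="2.95" />
    <input type="text" name="custom" id="device-id" value="" />
    <input type="text" name="return" value="http://example.com/Home/PaymentComplete" />
    <input type="text" name="notify_url" value="http://example.com/Home/IPN" />
    <input type="submit" value="Pay Now" />
</form>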

There are a couple of things you should notice in this form:
• The commented-out section lets me submit the payment request either to the PayPal sandbox (test) environment or to the actual live PayPal URL.
• Notice all the input type="text" fields. These are values that are returned to my controller after the transaction is complete. When I load the page, I set the "device-id" input to a custom value that I want PayPal to return when the transaction completes. I can use this value to update the user's status in my SQL database (assuming the transaction is successful).
• The "return" input defines the URL the user will be redirected to after the transaction is complete. It is important that you don't rely on this as a method for completing a transaction, because most users will never click on this link. Don't forget to create this page.
• The notify_url input is the IPN URL that PayPal will call after the transaction completes.
• Remember to associate the business input's email address with the PayPal address you used for your sandbox or live account.
• There are a lot more custom values you can use and learn more about here.

Step 3 – Creating an IPN Controller in MVC

Next we will want to create a new ActionResult controller that will receive the responder request from PayPal after the transaction completes. I want to give credit to this page where I was able to leverage a lot of code for this controller. You should also remember to actually create an IPN view (IPN.cshtml) page because PayPal will continue to call this controller until it gets a successful page returned.

public ActionResult IPN()
{
// Receive IPN request from PayPal and parse all the variables returned
var formVals = new Dictionary<string, string>();
formVals.Add("cmd", "_notify-validate");

// if you want to use the PayPal sandbox change this from false to true
string response = GetPayPalResponse(formVals, false);

if (response == "VERIFIED")
{
string transactionID = Request["txn_id"];
string sAmountPaid = Request["mc_gross"];
string deviceID = Request["custom"];

//validate the order
Decimal amountPaid = 0;
Decimal.TryParse(sAmountPaid, out amountPaid);

if (sAmountPaid == "2.95")
{
// take the information returned and store this into a subscription table
// this is where you would update your database with the details of the tran

return View();

}
else
{
// let fail - this is the IPN so there is no viewer
// you may want to log something here
}
}

return View();
}

string GetPayPalResponse(Dictionary<string, string> formVals, bool useSandbox)
{

// Parse the variables
// Choose whether to use sandbox or live environment
string paypalUrl = useSandbox ? "https://www.sandbox.paypal.com/cgi-bin/webscr"
: "https://www.paypal.com/cgi-bin/webscr";

HttpWebRequest req = (HttpWebRequest)WebRequest.Create(paypalUrl);

// Set values for the request back
req.Method = "POST";
req.ContentType = "application/x-www-form-urlencoded";

byte[] param = Request.BinaryRead(Request.ContentLength);
string strRequest = Encoding.ASCII.GetString(param);

StringBuilder sb = new StringBuilder();
sb.Append(strRequest);

foreach (string key in formVals.Keys)
{
sb.AppendFormat("&{0}={1}", key, formVals[key]);
}
// sb now holds the original IPN body plus the _notify-validate command
strRequest = sb.ToString();
req.ContentLength = strRequest.Length;

//for proxy
//WebProxy proxy = new WebProxy(new Uri("http://url:port"));
//req.Proxy = proxy;
//Send the request to PayPal and get the response
string response = "";
using (StreamWriter streamOut = new StreamWriter(req.GetRequestStream(), System.Text.Encoding.ASCII))
{

streamOut.Write(strRequest);
streamOut.Close();
using (StreamReader streamIn = new StreamReader(req.GetResponse().GetResponseStream()))
{
response = streamIn.ReadToEnd();
}
}

return response;
}

Step 4 – Enabling Sandbox and IPN

I highly recommend that you first test this with the PayPal sandbox. You need to sign up separately for a sandbox account from your live PayPal account from the developer page here.
For sandbox accounts, you do not need to enable IPN, but for live accounts you do. Once you are ready to switch to live, log in to your PayPal account and go to Profile | More Options | My Selling Tools | Choose Update for “Instant Payment Notifications” and then enable IPN and set the URL that you wish PayPal to call (which is the IPN controller we created above).

Summary

At this point you should be ready to go. One of the real pains of this system is that there is no easy way to debug your IPN controller if you are running the MVC app on a remote machine that does not have Visual Studio installed. For this reason, I chose to log each of the steps to a "Log" table that I created in my database. That way I could capture all the variables that PayPal returned and still get an idea of what was happening, or determine if there were any issues.
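A minimal sketch of that kind of logging (assuming a simple Log table with Source and Message columns; your schema may differ, and connectionString is assumed to be defined elsewhere) looks like this:

// Requires using System.Data.SqlClient;
void LogIpnStep(string source, string message)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO Log (LoggedAt, Source, Message) VALUES (GETUTCDATE(), @source, @message)", conn))
    {
        cmd.Parameters.AddWithValue("@source", source);
        cmd.Parameters.AddWithValue("@message", message);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

// In the IPN controller, capture everything PayPal sent:
LogIpnStep("IPN", Request.Form.ToString());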
I am still testing this and have not yet implemented it in my Cotega service for monitoring SQL Azure databases. For now, if you would like to see this working, I am using it in a Windows Phone Baby Monitor app that I wrote.
If you have any suggestions or problems I look forward to hearing from you.

How to Create an RSS Feed from Windows Azure Table Storage Data

May 3rd, 2012

Today I am going to talk about how you can create an RSS feed from Azure Table Storage data. Realistically, you could very easily change this code to create a feed from SQL Azure or any store for that matter (if you get stuck email me at Liam AT Cotega dot com). The code that I am going to use is based on a sample created by DeveloperZen. I am also going to use the trick I blogged about to return the top X rows from Table Storage in reverse chronological order.

The reason I did this is because I needed to be able to provide my customers with an RSS / XML feed of the most recent items logged by the Cotega service. This service logs statistical information about a user's database, such as query performance, blocked queries, etc. By providing this data as an RSS feed, it is easy for my customers to embed the feed in their own monitoring systems if they do not wish to use the Cotega dashboard. I am also hoping to use this feed (or another I am creating in JSON) to allow people to build mobile apps so that they can monitor their database from a smartphone. By the way, if you are interested in helping with this, let me know.

How it Works
At the bottom of this post, I have included an MVC3 project that shows how this works. If you open the solution in Visual Studio, you will want to focus on the HomeController.cs file which is located in the /Controllers folder.

For this to work, I will use the ServiceModel Syndication namespace for RSS and the Windows Azure namespaces to connect to my storage. As such, I have added the following to the controller.

using System.Xml;
using System.ServiceModel.Syndication;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Diagnostics;

Within this controller there is one key function and two classes. The first class defines the structure of the data that will be queried from Azure Table Storage, and the second defines an ActionResult for the RSS feed. The Feed function is what gets called when you enter http://x.x.x.x/home/feed in your browser. You may want to set a breakpoint on this when you run the code. Some of the things this function does are:

  • Receive an agentName and userName.  These parameters are used to filter the results that come back from Azure storage.
  • Create a connection string to my Windows Azure Table store called storageConnectionString.  Notice you will need to replace [ACCOUNTNAME] and [ACCOUNTKEY] with your own credentials.
  • Create an endDt that is set to today's date.  I use this as a trick to sort the results in reverse chronological order.  For more details on why I do this, see this page.
  • Take the results of this query, fill up the RSS feed with the 50 most recent items from Table Storage, and return this to the browser.
[HttpGet]
public ActionResult Feed(string agentName, string userName)
{
string storageConnectionString = "DefaultEndpointsProtocol=http;AccountName=[ACCOUNTNAME];AccountKey=[ACCOUNTKEY]";

//Select most recent 50 items in descending order of date
DateTime endDt = DateTime.Now.ToUniversalTime();

//create syndication feed
SyndicationFeed feed = new SyndicationFeed("Monitoring Agent Feed", "This is an RSS feed for monitoring agent logged data.",
new Uri("http://127.0.0.1:81/alert/feed"), "075211", DateTime.Now);

try
{
// Get the loggged data
CloudStorageAccount Account;
Account = CloudStorageAccount.Parse(storageConnectionString);

// Connect to Azure Tables
CloudTableClient TableClient = new CloudTableClient(Account.TableEndpoint.ToString(), Account.Credentials);
TableServiceContext tableServiceContext = TableClient.GetDataServiceContext();

// Use rowEndKeyToUse so that I can sort the data in reverse chronological order
string rowEndKeyToUse = string.Format("{0:D19}", DateTime.MaxValue.Ticks - endDt.Ticks);

var results = (from g in tableServiceContext.CreateQuery<LogEvent>("logevent")
where g.PartitionKey == userName
&& g.RowKey.CompareTo(rowEndKeyToUse) > 0
&& g.EventType.CompareTo(agentName) == 0
select g).Take(50);

//add feed items
List<SyndicationItem> syndicationItems = new List<SyndicationItem>();

foreach (LogEvent logEvent in results)
{
var syndicationItem = new SyndicationItem(String.Format("{0}", logEvent.EventType),
String.Format("Watching: {0}, Value: {1}", logEvent.Watching, logEvent.EventValue),
new Uri("http://cotega.com"),
// You may want to replace this guid with the rowkey and partitionkey if not concerned about publishing this in the feed
System.Guid.NewGuid().ToString(),
logEvent.Timestamp);

syndicationItems.Add(syndicationItem);
}

feed.Items = syndicationItems;
}
catch (Exception ex)
{
//Log error
}

//return feed
return new RssActionResult() { Feed = feed };
}
}

public class LogEvent : TableServiceEntity
{
public string UserName { get; set; }
public string EventType { get; set; }
public int EventValue { get; set; }
public string Watching { get; set; }
public DateTime EventTime { get; set; }
}

/// <summary>
/// RssActionResult class.
/// </summary>
public class RssActionResult : ActionResult
{
    public SyndicationFeed Feed { get; set; }

    public override void ExecuteResult(ControllerContext context)
    {
        context.HttpContext.Response.ContentType = "application/rss+xml";
        Rss20FeedFormatter rssFormatter = new Rss20FeedFormatter(Feed);
        using (XmlWriter writer = XmlWriter.Create(context.HttpContext.Response.Output))
        {
            rssFormatter.WriteTo(writer);
        }
    }
}
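For reference, the reverse-chronological ordering works because the writer side stores each entity with an inverted-timestamp RowKey, which is what the rowEndKeyToUse comparison above depends on. A sketch of that write-side line (assuming the RowKey is assigned when the event is logged) looks like:

// Invert the timestamp so that newer rows sort first lexicographically.
logEvent.RowKey = string.Format("{0:D19}", DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks);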

That’s it. If you have any questions, let me know in the comments.
Download Sample – Azure RSS Feed

What I Learned Building a Startup on Microsoft Cloud Services: Part 12 – Your Customers are my Customers

April 27th, 2012

I am the founder of a startup called Cotega and also a Microsoft employee within the SQL Azure group where I work as a Program Manager. This is a series of posts where I talk about my experience building a startup outside of Microsoft. I do my best to take my Microsoft hat off and tell both the good parts and the bad parts I experienced using Azure.

Over the years I have constantly been amazed at how customers end up using the products I work on.  No matter how much thought and planning has been put into a product, it seems you can never fully appreciate how customers will use it until they actually start using it.  I suppose that is a key reason for building a "Minimum Viable Product" and getting your product out the door as soon as possible: so that you can get this feedback and iterate quickly.

One example of this became very clear early on in the Cotega beta.  Initially, I thought that charting of logged data would be really useful for DBAs to visualize the trends happening over time within their database.  This was a key reason for adding the feature.  After talking with various people, I started to realize that although this was in fact useful to DBAs, what many of them really wanted was to take these charts and embed them in their own web sites so that their customers could see the health of the system.  In a way, their customers would be my customers.  This was really a surprise to me and something I had never thought of.  As it turned out, it was very easy to implement, because it was just a matter of creating a new MVC page that accepts the name of the chart and the user name to be used to build the chart.  Then I could use either <OBJECT> or <IFRAME> tags to take this page and embed it.  Here is an example that shows a historical look at how long it takes to connect to my SQL Azure database. The numbers to the right indicate how many milliseconds it took to complete the connection.


This chart will be loaded dynamically each time you load this page. Here is the code that I used to embed this chart. Notice that I used an iframe, because WordPress does not work well with the Object tag.

<iframe src="http://cotega.com/home/charts?agentName=Monitor Customer Database&amp;userName=liam" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" width="500" height="400"></iframe>

Cloud Competitive Advantage

As it turned out, this ended up being a real competitive advantage for me over traditional on-premises monitoring solutions.  Since Cotega is hosted completely in the Azure environment, it is very easy for me to make these charts available to be embedded in customer’s web sites.   If Cotega was an on-premises system, this would have been much more difficult due to firewall and other issues.  Also, if the charting was only used by DBA’s, this capability to embed charts and data would not be nearly as critical.

Protovis Charting

While I am on the subject, you might be interested to learn how I chose to implement the charting.  In the early stages of the beta, I used a charting control from Infragistics.  This control was great and very easy to use (although certainly not cheap).  In the end, I decided to use a charting control from Protovis because it had the ability to copy and paste charts directly from the dashboard so that they could be used in something like Excel or Word.  If you are using Internet Explorer, give this a try by right-clicking on the above chart, copying it, and pasting it into a Word document. Very cool, right? The other nice part about Protovis is that it is easy to use and free.

What I Learned Building a Startup on Microsoft Cloud Services: Part 11 – Vacation Time, Profitability and Listening to the Customers

April 17th, 2012

I am the founder of a startup called Cotega and also a Microsoft employee within the SQL Azure group where I work as a Program Manager. This is a series of posts where I talk about my experience building a startup outside of Microsoft. I do my best to take my Microsoft hat off and tell both the good parts and the bad parts I experienced using Azure.

Vacation Time

It has been a few weeks since my last post and I apologize for that.  One of the big reasons I have not posted more recently is that I took a vacation with my family.  One of the downsides of starting a business by yourself is that you are never truly on vacation.  It is important that Cotega is always running, and although it is pretty self-sufficient and (luckily) I have not had any major issues with the service other than that one Azure outage, I am always concerned that something may happen or that a customer will have a problem.  With most jobs there is always someone who knows how to get a hold of you if there is a major issue, and if you have a co-founder there is always someone who can step in when there is a problem.  I think that is really one of the other big advantages of bringing on a co-founder.

Profitability

Since I last posted, I started charging for the service and I am happy to say that Cotega is now officially profitable.  Yeah!…  Well, let me be more clear about that.  By profitable, I mean I am officially making enough money to cover the costs of operating the service (including those costs that will come when my BizSpark program ends).  I am still a long way from taking any major salary.  But for me this is a big step, because it was one of the main goals of starting Cotega.  If you are reading this and are thinking of doing a startup, I have to tell you that one of the most exciting things to see is those first transactions coming into your account.  Even the small transactions are incredibly exciting.  I think it has to do with the realization that there are in fact customers out there who are interested in what you are doing and are willing to pay for it.

New Features and Customer Suggestions

The other big reason why I have not posted is because I have been focusing on some new features for the service.  Yesterday I deployed an update that allows for monitoring of blocked and poorly performing queries.  Each of these new features has come from existing customers I have been working with.  I keep the suggestions in the GitHub issue repository, where I can track the features that are most commonly requested and start working on those first.  The other great source of ideas has been people I have contacted who are not customers at all.  For example, Microsoft has some amazing MVPs who work closely with different Microsoft technologies and are absolute experts in these products.  Every MVP I have ever worked with has gone way out of their way to help me, and some of the best ideas (that are still in the works) have come from these people.

If you have not yet tried the service but are interested in seeing it, I created this code that you can use to try any of the plans free for 30 days: 30dayfree

Liam