Monitoring SQL Server database status changes using T-SQL and PowerShell scripts

Monitoring a SQL Server database is a critical component of database administration. Surveys suggest that a large majority of organizations expect the number of databases they run to increase over the next twelve months, and growing data volumes can negatively affect database availability. Hence, SQL Server database monitoring is considered a critical responsibility of the database administrator. Organizations tend to spend a lot of money on enterprise monitoring solutions, and given the sensitive and growing nature of business and user needs, application availability is more important than ever.

Data Flow Diagram

Let’s look at the high-level view of this implementation:

  1. A dedicated SQL instance has SQL Server Database Mail configured
  2. On the dedicated instance, a SQL Server Agent job is used to schedule the SQL jobs
  3. The job invokes the stored procedure that performs the XML data transformation and sends the email to the intended recipients via SMTP
  4. The SQL file that contains the code to be executed on the listed SQL instances is placed in a shared path, which should be accessible from all the SQL instances
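The Database Mail step above can be sketched in T-SQL. This is a minimal, hedged example of sending a query result from an Agent job step; the profile name, recipient list, and query are assumptions, not the article's actual code:

```sql
-- Assumes Database Mail is already configured on the dedicated instance
-- with a profile named 'DBA_Mail_Profile' (hypothetical).
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DBA_Mail_Profile',       -- hypothetical DB Mail profile
    @recipients   = 'dba-team@example.com',   -- hypothetical recipient list
    @subject      = 'Database status change report',
    @query        = N'SELECT name, state_desc FROM sys.databases
                      WHERE state_desc <> ''ONLINE'';',
    @attach_query_result_as_file = 1;
```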

Further reading

Monitoring SQL Server database status changes using T-SQL and PowerShell scripts

Please share your thoughts in the comments section. I would love to hear and learn from you as well.

Thanks for reading my space!

Posted in SQL, T-SQL, XML

The evolution of SQL Server towards Digital Transformation challenges

SQL Server on Linux boosts the database market for Microsoft. Support for PolyBase (a feature for working with Big Data providers), In-Memory Optimized SQL Server, Real-Time Operational Analytics, the scaling of Python and R Services for data analytics, graph database support for NoSQL data, JSON support as a transparent data interchange format between traditional and non-traditional database systems, Azure Cosmos DB evolving from a document database to a globally distributed database… All of this helps leverage SQL Server to almost every extent in day-to-day activities.

The shift in technology is being driven by increased expectations. The time-to-market is lower when it comes to applications; the competition is fierce. Also, there’s a lot of unstructured data floating in the ether, such as videos, images, audio, etc., which are more prevalent and problematic for traditional databases. And SQL Server 2017 has emerged, attempting to answer these calls.

It does, though, have the potential to be a powerhouse of desirable features. It is a little early to say whether SQL Server 2017 will answer the myriad of requirements we have; it may also require a lot of fine-tuning and improvisation. But it is perhaps safe to say that Microsoft does seem to be taking it seriously and taking the necessary steps.

Evolution of SQL Server towards Digital Transformation

Please share your thoughts in the comments section. I would love to hear and learn from you as well.

Thanks for reading my space!

Posted in SQL 2016, SQL 2017 on Linux

How to configure Always Encrypted in SQL Server 2016 using SSMS, PowerShell and T-SQL

In an era of remote storage and retrieval of data, including the cloud, data security plays a vital role, especially since data is vulnerable during transit. In situations such as backing up a database, or copying it to or from the cloud, there is always a risk of exposing data to the outside world, lurking around one corner or the other. We have seen a noticeable surge in technologies around the protection and security of data from a world full of unsafe hands. Efforts are being made to protect data at very granular levels of the encryption hierarchy. The importance of protecting business data cannot be stressed enough.

One way of inching towards more secure transmission of data is to enable Always Encrypted on the database. We'll look into the various options we have for enabling this, down to a granular level, namely the column level.

The Always Encrypted feature was initially available only in the Enterprise and Developer editions of SQL Server 2016. Later, with SQL Server 2016 SP1, it was made available in all editions. Always Encrypted can encrypt data at the column level.

There are several ways to configure the Always Encrypted feature:

  1. Using the Always Encrypted wizard
  2. Configuring Always Encrypted using SSMS
  3. Creating the master key and column encryption key using T-SQL and enabling encryption
  4. Configuring Always Encrypted using PowerShell
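As a rough sketch of the T-SQL route (option 3), the setup looks something like the following. The key names, certificate path, and table are illustrative assumptions; the column master key itself must already exist in a key store such as the Windows certificate store, and the ENCRYPTED_VALUE is normally generated by SSMS or PowerShell tooling, not typed by hand:

```sql
-- Column master key metadata pointing at a certificate in the local store
CREATE COLUMN MASTER KEY CMK_Demo
WITH (
    KEY_STORE_PROVIDER_NAME = 'MSSQL_CERTIFICATE_STORE',
    KEY_PATH = 'CurrentUser/My/FFAA0000'          -- hypothetical thumbprint
);

-- Column encryption key protected by the master key
CREATE COLUMN ENCRYPTION KEY CEK_Demo
WITH VALUES (
    COLUMN_MASTER_KEY = CMK_Demo,
    ALGORITHM = 'RSA_OAEP',
    ENCRYPTED_VALUE = 0x016E0000                  -- placeholder; tooling generates this
);

-- Enabling encryption on a column at table creation (illustrative table)
CREATE TABLE dbo.Customers (
    CustomerId INT IDENTITY PRIMARY KEY,
    SSN CHAR(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (
            COLUMN_ENCRYPTION_KEY = CEK_Demo,
            ENCRYPTION_TYPE = DETERMINISTIC,
            ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
        )
);
```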

How to configure Always Encrypted in SQL Server 2016 using SSMS, PowerShell and T-SQL

Please share your thoughts in the comments section. I would love to hear and learn from you as well.

Thanks for reading my space!

Posted in SQL 2016

Recover Deleted Data from SQL Table Using Transaction Log and LSNs


At times, a user may perform an UPDATE or DELETE operation in a SQL Server database without applying a WHERE condition. This is a very common cause of data loss from SQL Server tables. As SQL Server is a highly popular relational DBMS among corporate sectors and businesses, the data loss problem magnifies even more. So users should be aware of the methods to recover deleted data from a SQL Server table in case of any mishaps.

Deleted rows can be recovered if the time of their deletion is known. This can be done through the use of Log Sequence Numbers (LSNs). An LSN is a unique identifier given to every record present in the SQL Server transaction log. The upcoming section will discuss the process to recover deleted SQL Server data and tables with the help of the transaction log and LSNs.

Recommended: To avoid the long and error-prone manual LSN approach to recovering deleted records from a SQL table, make use of advanced third-party software such as SQL Database Repair.

Recover Deleted Data from SQL Server Table By Transaction Logs

There are some prerequisites to be fulfilled before we start the process to recover deleted data from a SQL table. To recover deleted rows from a table, the database must have been in the BULK_LOGGED or FULL recovery model at the time the deletion occurred. Some quick action is required so that the log records are still available to perform the recovery.
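The recovery model prerequisite can be verified with a quick query before going further; the database name here is illustrative:

```sql
-- The database must be in FULL or BULK_LOGGED recovery
-- for the relevant log records to be available
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'Databasename';
```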

Follow the steps given below to recover deleted data from SQL Server 2005, 2008, 2012, 2014, and 2016 using transaction log backups:

Step 1: Check the number of rows present in the table from which the data has been accidentally deleted, using the query below

SELECT COUNT(*) FROM Table_name

Step 2: Take the transaction log backup of the database using the query given below

USE Databasename
GO
BACKUP LOG [Databasename]
TO DISK = N'D:\Databasename\RDDTrLog.trn'
WITH NOFORMAT, NOINIT,
NAME = N'Databasename-Transaction Log Backup',
SKIP, NOREWIND, NOUNLOAD, STATS = 10

Step 3: In order to recover deleted data from SQL Server Table, we need to collect some information about the deleted rows. Run the query given below to achieve this purpose

USE Databasename
GO
SELECT [Current LSN], [Transaction ID], Operation, Context, AllocUnitName
FROM fn_dblog(NULL, NULL)
WHERE Operation = 'LOP_DELETE_ROWS'

From the query above, we obtain the Transaction ID (say, 000:000001f3) of the deleted rows. Next, we determine the time at which these rows were deleted, using this ID.

Step 4: In this step, we will find the specific time at which rows were deleted using the transaction ID 000:000001f3. This is done by executing the query given as follows

USE Databasename
GO
SELECT [Current LSN], Operation, [Transaction ID], [Begin Time], [Transaction Name], [Transaction SID]
FROM fn_dblog(NULL, NULL)
WHERE [Transaction ID] = '000:000001f3'
AND [Operation] = 'LOP_BEGIN_XACT'

On executing this query, we get the value of the current Log Sequence Number (say, 00000020:000001d0:0001).

Step 5: Now we will start the restore process to recover the deleted rows, by restoring the full backup into a copy of the database (with NORECOVERY) using the query below

USE master
GO
RESTORE DATABASE Databasename_COPY
FROM DISK = 'D:\Databasename\RDDFull.bak'
WITH MOVE 'Databasename' TO 'D:\RecoverDB\Databasename.mdf',
MOVE 'Databasename_log' TO 'D:\RecoverDB\Databasename_log.ldf',
REPLACE, NORECOVERY

Step 6: Now apply the transaction log backup to restore the deleted rows, stopping just before LSN "00000020:000001d0:0001"

USE master
GO
RESTORE LOG Databasename_COPY
FROM DISK = N'D:\Databasename\RDDTrLog.trn'
WITH STOPBEFOREMARK = 'lsn:0x00000020:000001d0:0001',
RECOVERY

Note: LSN values are in hexadecimal form; the 0x prefix added before the LSN, as shown above, tells RESTORE to interpret the value as hexadecimal, so no manual conversion is needed.

Step 7: The process to recover the deleted records is now complete. Check whether the deleted records are back in the restored database named 'Databasename_COPY'.

USE Databasename_COPY
SELECT * FROM Table_name

Disadvantages of Transaction Log Approach

  • Highly time-consuming method to recover deleted data from a SQL Server table, as it involves several lengthy queries
  • Extremely complex to implement for users who do not possess adequate technical knowledge
  • Greater chance of losing data due to errors during query execution

Although the Log Sequence Number method can restore deleted records from SQL tables, it is not a recommended option for most users due to its complexity and tediousness. Instead, it is advisable to use an automated solution to recover deleted data from a SQL Server table.


Posted in SQL, SQL 2016, SQL Server 2017

SQL Server DB Migration – Cloning a database to another collation

Database migration is a vital task in any environment, complex or otherwise. Seamless migrations are the goal but the efforts required to ensure it are tremendous.

A global SQL Server Collation Sequence update is the answer to the puzzle, but it is a multi-step operation. The first step is to identify the case sensitivity of the collation assigned to the repository database. If you find that the repository database is not case sensitive, you must create a new empty (blank) case-sensitive database to migrate the data to. You must then import the data from the old case-insensitive repository database to the new empty case-sensitive database, and then establish the new case-sensitive database as the repository database.
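The first two steps can be sketched as follows. The database names and collation are assumptions; the actual data import would typically be done with the Import/Export wizard, bcp, or a similar tool:

```sql
-- Check the case sensitivity of the current repository database
-- (a CI collation name means case-insensitive, CS means case-sensitive)
SELECT name, collation_name
FROM sys.databases
WHERE name = 'RepositoryDB';                   -- hypothetical repository name

-- Create the new, empty case-sensitive database to migrate into
CREATE DATABASE RepositoryDB_CS
COLLATE SQL_Latin1_General_CP1_CS_AS;          -- CS = case-sensitive variant
```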

Further reading…


Please share your thoughts in the comments section. I would love to hear and learn from you as well.

Thanks for reading my space!


Posted in SQL, T-SQL

How to Capture Database(s) usage stats using sp_spaceused and PowerShell

The objective of this article is to look at the possible ways of capturing database usage trends using the most commonly available tools and techniques. The integration of SQL, the use of sp_spaceused, and the flexibility offered by PowerShell have been used to get the required database growth patterns; this data is combined into a single result. After the transformation is done, the data is stored in a dedicated repository.
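At its simplest, the raw numbers come from sp_spaceused, run in the context of each monitored database; a minimal sketch (the database name is illustrative):

```sql
USE Databasename;        -- run once per monitored database
GO
-- First result set: database size and unallocated space
-- Second result set: reserved, data, index_size, unused
EXEC sp_spaceused;
```

It is the two separate result sets returned here that make plain T-SQL awkward, and why the article leans on PowerShell to combine them.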

Continue reading…


Wrap Up

In this article, we looked at modifying the output of the sp_spaceused system stored procedure and fetching raw data that can be put to use after some transformations. We primarily looked at overcoming the challenge of combining the two result sets: using PowerShell, we merged the data into a friendly JSON report, which can then be transformed into a SQL table stored in a central repository. That repository now serves as a source of raw data for generating meaningful reports. We also saw how to use the OPENROWSET (BULK) function together with JSON constructs to import JSON files into SQL Server.
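The JSON import mentioned above can be sketched with OPENROWSET (BULK) and OPENJSON; the file path and property names below are assumptions about the report's shape, not the article's actual schema:

```sql
-- Read the JSON report produced by PowerShell and shred it into rows
SELECT usage.*
FROM OPENROWSET(BULK 'D:\Reports\db_usage.json', SINGLE_CLOB) AS j  -- hypothetical path
CROSS APPLY OPENJSON(j.BulkColumn)
WITH (
    DatabaseName SYSNAME '$.DatabaseName',     -- hypothetical JSON properties
    ReservedKB   BIGINT  '$.ReservedKB',
    DataKB       BIGINT  '$.DataKB'
) AS usage;
```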

Fetching the raw data and storing is important. When processed, this data would be helpful in capacity planning and forecasting. We shall discuss on how to put this data to use in capacity planning, in a separate article.

Happy Learning…



Posted in JSON, SQL, SQL 2012, SQL 2016, T-SQL

Multi-server Script to Find Orphaned Data Files using PowerShell

In this article, we take the sledgehammer approach: searching the listed drives for MDF files and comparing them against the data files the SQL instances actually know about. The idea is to generate a report of detached (orphaned) database files across multiple servers. That is one way to keep our databases compliant in terms of meeting SLAs and ensuring optimal performance; running such reports periodically helps us achieve compliance in terms of file auditing.
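The article's script does this with PowerShell across many servers; on a single instance, the same comparison can be sketched in T-SQL. The drive path is an assumption, and xp_dirtree is an undocumented but commonly used procedure:

```sql
-- Data files SQL Server currently knows about on the listed drive
SELECT physical_name
FROM sys.master_files
WHERE physical_name LIKE 'D:\Data\%';          -- hypothetical data drive

-- Files actually present on disk, for comparison against the list above
EXEC master.sys.xp_dirtree 'D:\Data', 1, 1;    -- depth 1, include files
```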

Further reading: multi-server-script-find-orphaned-data-files-using-powershell

Please share your thoughts in the comments section. I would love to hear and learn from you as well.

Thanks for reading my space!



Posted in SQL, Uncategorized

Planning a SQL Server Backup and Restore strategy in a multi-server environment using PowerShell and T-SQL

This post demonstrates one of the ways to gather an inventory of database backup information. The output of the script includes various columns that show the internals of a database, backup size, and gives the latest completion statuses and the corresponding backup sizes. Though the output is derived from the msdb database, having the data consolidated at one place gives us better control and provides us with greater visibility into the database process. A clear understanding of these parameters is a good way to forecast storage requirements. We can schedule a job to pull the data to a central repository and then develop a report on capacity planning and forecasting on a broader spectrum of database backup. That way, we’d have an insight into the sizing of every backup type, such as Full, Differential, and Log. With such information, we can easily decide on the type of backup required at a very granular level, i.e., at the database level.

Getting started

There are many ways to gather data in a central server repository such as using T-SQL and PowerShell. In this section, I’m going to discuss the data gathering tasks using PowerShell cmdlets.

The pre-requisites are

  • SSMS version 16.4.1 or higher
  • SQL Server PowerShell module

New cmdlets introduced with the SqlServer module replace SQLPS, retaining the old SQLPS functionality while adding a richer set of libraries. It is safe to remove SQLPS and load the SqlServer module instead.
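Whether gathered via PowerShell or plain T-SQL, the underlying backup history lives in msdb; a hedged sketch of a per-database, per-type summary query (the column choices are illustrative, not the article's exact script):

```sql
-- Latest backup of each type per database, from msdb history
SELECT  bs.database_name,
        bs.type,                                   -- D = Full, I = Differential, L = Log
        MAX(bs.backup_finish_date)       AS last_backup_finish,
        MAX(bs.backup_size / 1048576.0)  AS max_backup_size_mb
FROM msdb.dbo.backupset AS bs
GROUP BY bs.database_name, bs.type
ORDER BY bs.database_name, bs.type;
```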

Further Reading


Happy Learning!!


Posted in SQL, SQL Server Backup

A Quick Start Guide to Managing SQL Server 2017 on CentOS/RHEL Using the SSH Protocol

This article details the following:

  1. How to get PuTTY and SQL Server
  2. How to install/uninstall SQL 2017 CTP2.1
  3. How to Upgrade to SQL 2017 RC1
  4. How to Install/remove SQL Tools

In one of my previous articles, Installation of SQL Server vNext CTP on Linux Distribution CentOS 7, I spoke about the installation of SQL Server 2017 on a Linux computer. Let us now look at how a SQL Server installation and configuration can also be managed over the Secure Shell (SSH) protocol using PuTTY. We shall look at the important details, such as how to get PuTTY and SQL Server, how to install/uninstall SQL 2017 CTP2.1, how to upgrade to SQL 2017 RC1, and how to install/remove the SQL tools.

Further reading, A Quick Start Guide to Managing SQL Server 2017 on CentOS/RHEL Using the SSH Protocol

Suggestions and comments are most welcome!

Happy Learning!!






Posted in SQL 2017 on Linux, SQL Server 2017

How to Configure TDE database with AlwaysOn using the Azure Key Vault in SQL Server 2016

One of the recent tasks that I undertook seemed rather like an experiment to me. I set out to configure Transparent Data Encryption (TDE) using asymmetric key protection with Azure Key Vault, in an AlwaysOn setup. That, to me, opened up a whole different dimension of data security.
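At a very high level, once the SQL Server Connector for Azure Key Vault is installed and a cryptographic (EKM) provider is registered, the TDE side looks roughly like this. The provider name, key name, and database are assumptions for illustration:

```sql
-- In master: asymmetric key backed by a key held in Azure Key Vault,
-- surfaced through a registered EKM provider
CREATE ASYMMETRIC KEY TDE_AKV_Key
FROM PROVIDER AzureKeyVault_EKM                -- hypothetical provider name
WITH PROVIDER_KEY_NAME = 'TDEKey1',            -- hypothetical key in the vault
     CREATION_DISPOSITION = OPEN_EXISTING;

-- In the user database: encryption key protected by that asymmetric key,
-- then TDE switched on
USE Databasename;                              -- illustrative database
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER ASYMMETRIC KEY TDE_AKV_Key;

ALTER DATABASE Databasename SET ENCRYPTION ON;
```

In an AlwaysOn setup, the same connector and key would also need to be available on every replica so the encrypted database can be restored and joined there.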


Happy Learning!!

Posted in AlwaysOn, SQL 2016, SQL AZURE