Working with Database-MSSQL Server (1): Introduction to Microsoft SQL Server and Explanation of what we will do in this series of articles.

In the Working with Database series, we will first examine Relational Database Management Systems: Microsoft SQL Server, MySQL, PostgreSQL, and MariaDB, then Amazon RDS as a SQL database. After that, we will examine MongoDB and Amazon DynamoDB as non-relational database management systems (commonly referred to as NoSQL databases). We will cover all the subjects practically (hands-on) and step by step.

Cumhur Akkaya
13 min read · Aug 16, 2023

Relational databases store data in tables. Tables can grow large and have a multitude of columns and records. Relational database management systems (RDBMSs) use SQL (and variants of SQL) to manage the data in these large tables.

Non-relational (NoSQL) databases use a data model with a different structure than the rows-and-columns table structure used in RDBMSs. NoSQL databases also differ from each other; there are four main kinds: document databases, key-value stores, column-oriented databases, and graph databases. (1)

Firstly, we will start the article series with a relational database management system: Microsoft SQL Server 2022. We will run Microsoft SQL Server and an Apache web server + PHP together, first in containers and then in a Kubernetes cluster. We will run the containers first with Docker Compose, then with Amazon Elastic Container Service (ECS), and later with Kubernetes YAML files by using Helm on the Elastic Kubernetes Service (EKS).

We will enter data into Microsoft SQL Server and then query it.
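As a rough sketch of what those queries will look like later in the series (the database name and the Inventory table columns here are illustrative; we will build the real ones with sqlcmd in Step 6):

```sql
-- Create a test database and a simple table (names are illustrative)
CREATE DATABASE TestDB;
GO
USE TestDB;
GO
CREATE TABLE Inventory (id INT, name NVARCHAR(50), quantity INT);
INSERT INTO Inventory VALUES (1, 'banana', 150), (2, 'orange', 154);
GO
-- Query rows with a quantity greater than 152
SELECT * FROM Inventory WHERE quantity > 152;
GO
```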

We will troubleshoot the containers (both Microsoft SQL Server and the web server). Together, we will examine, one by one, the six problems and solutions I encountered while setting up Microsoft SQL Server.

Finally, we will write a Jenkins pipeline to deploy and update them automatically in the CI/CD lifecycle.

For Microsoft SQL Server, we will first prepare Dockerfiles and then create two Docker containers from them. One holds a Microsoft SQL database, and the other holds a web server (Apache) with a pre-defined PHP script to test the database. We will write the Dockerfiles and install the proper driver to connect to Microsoft SQL Server step by step, by following the relevant Microsoft and Docker Hub pages.
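As a minimal sketch of the direction we will take (the base image tag and environment variables below are assumptions; we write the real Dockerfile step by step in Step 3, and in Step 4 we move the password into a secret or .env file):

```dockerfile
# Sketch: start from Microsoft's official SQL Server 2022 image
FROM mcr.microsoft.com/mssql/server:2022-latest

# Accept the EULA and set the SA password (hard-coded here only
# for illustration; Step 4 replaces this with a secret/.env file)
ENV ACCEPT_EULA=Y
ENV MSSQL_SA_PASSWORD=YourStrong!Passw0rd

# SQL Server listens on port 1433 by default
EXPOSE 1433
```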

We will write a launcher, which starts both containers. For this, we will first write a Docker Compose file for the Docker containers, then a task definition file for Amazon Elastic Container Service (ECS), and finally Kubernetes YAML files for the Elastic Kubernetes Service (EKS).

Running with the Docker Compose file in Docker containers.
Running with the task definition file on the Amazon Elastic Container Service (ECS) cluster (Fargate).
Running with the Kubernetes YAML files by using Helm on the Elastic Kubernetes Service (EKS).
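For the Docker Compose case, the launcher will look roughly like the sketch below (the service names, build paths, and password value are assumptions for illustration; the real file is written in Step 6.1):

```yaml
version: "3.8"
services:
  db:
    build: ./mssql          # Dockerfile for MS SQL Server
    ports:
      - "1433:1433"
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "YourStrong!Passw0rd"
  web:
    build: ./web            # Dockerfile for Apache + PHP
    ports:
      - "80:80"
    depends_on:
      - db                  # start the database before the web server
```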

Topics we will cover in this article:

Explanation of what we will do in this series of articles (There are videos of the outputs I obtained at the end of my hands-on experiments with Microsoft SQL Server 2022 in this series of articles.)

Full Index of the Microsoft SQL Server’s Article

Step-1. Preparing development environment to run the containers.

Step-2. Preparing GitHub repository

As a result

Next post

References

If you like the article, I will be happy if you click the Medium Follow button to encourage me to write more, and so you don't miss future articles.

Your claps, follows, and subscriptions help my articles reach a broader audience. Thank you in advance.

Explanation of what we will do in this series of articles:

In the practical work that we will do with the database, we will obtain the following outputs at the end of each chapter.

Below are the videos of the outputs that I obtained at the end of my hands-on work with Microsoft SQL Server 2022 in this series of articles.

1. When we run with Docker Compose, we will get the results shown in the video below:

If you want to try it too, just run the following command in this folder path: “mssql-server-handson/with-docker-compose”.

# Prerequisites: Docker and Docker Compose must be installed.
docker-compose up --build

2. When we run with Amazon ECS, we will get the results shown in the video below:

If you want to try it too, just run the following command in this folder path: “mssql-server-handson/with-ecs”

# Prerequisites: AWS CLI and the required credentials must be installed. Also, in the `aws ecs create-service` command, you must specify your own subnet and security group, and in `taskdefination.json`, your own image URI.
aws ecs create-cluster --cluster-name cumhur-vero-digital
aws ecs register-task-definition --cli-input-json file://cumhur-vero-digital-ecs-taskdefination.json
aws ecs create-service --cluster cumhur-vero-digital --service-name cumhur-vero-digital-service --task-definition cumhur-vero-digital --desired-count 1 --launch-type "FARGATE" --network-configuration "awsvpcConfiguration={subnets=[subnet-06a91b5b2635f27d9],securityGroups=[sg-07054ec51b1a0cc5f],assignPublicIp=ENABLED}"

3. When we run with Amazon ECS by using a Jenkins pipeline and a GitHub webhook, we will get the results shown in the video below:

If you want to try it too, just run the Jenkins pipeline from “Jenkinsfile” in this folder path: “mssql-server-handson/with-ecs”

4. When we run with Amazon EKS, we will get the results shown in the video below:

In the Working with Database series, I will write about and explain Microsoft SQL Server, the first part of this series of articles, according to the plan below:

Full Index of Microsoft SQL Server’s Article :

Step-1. Preparing development environment to run the containers.

Step-2. Preparing GitHub repository

  • 2. a. Creating a DevOps repository for the Development Stage
  • 2. b. Working in different branches for this project
  • 2. c. Pushing created files to the GitHub repo

Step-3. Preparing Dockerfiles

  • 3. a. Writing a Dockerfile for MS SQL Server
  • What is MS SQL Server
  • Important notes 1–4
  • Dockerfile for MS SQL Server
  • 3. b. Writing a Dockerfile for Webserver(Apache) + Php
  • Important notes 1–5
  • Dockerfile for Webserver(Apache)+Php

Step-4. For security, changing the system administrator password and Using secret or .env files.

Step-5. Troubleshooting in the containers

  • Error-1: [sqlsrv_errors] => SQLSTATE[HYT00]: [Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired) DB failed to connect.
  • Error-2: “sqlcmd: command not found” — sqlcmd not available after installing mssql-tools.
  • Error-3: Couldn’t connect to Docker daemon at http+docker://localhost
  • Error-4: for PHP can not start service php: driver failed…
  • Error-5: /opt/mssql/bin/sqlservr:Error: The system directory [/.system] could not be created. File: LinuxDirectory.cpp:420 [Status: 0xC0000022 Access Denied errno = 0xD(13) Permission denied] devops-task-vero_db_1 exited with code 1.
  • Error-6: Login failed for user ‘SA’. Reason: Password did not match that for the login provided. [CLIENT: 127.0.0.1]

Writing a launcher, which builds and starts both containers

Step-6. Using Docker Compose OR Amazon Elastic Container Service (ECS)

Step-6.1. Running container by using Docker compose

  • 6.1.1. Write a Docker compose file for the deployment
  • 6.1.2. Check EC2 instance Security Group
  • 6.1.3. Run Docker compose file and connect to SQL server
  • 6.1.4. Create a database by using sqlcmd commands.
  • 6.1.5. Insert data by using sqlcmd commands.
  • 6.1.6. Run a query to return data from the Inventory table.
  • 6.1.7. Trying the volumes we created
  • 6.1.7.a. Testing the volume data that we connected to the db (MSSQL Server Database) container
  • 6.1.7.b. Testing the volume src that we connected to the web container
  • 6.1.8. Using phpMyAdmin
  • 6.1.9. Using VScode extension

Step-6.2. Running container by using Amazon ECS

  • 6.2.1. Creating ECR Repository
  • 6.2.2. Pushing Docker image to ECR Repository
  • 6.2.3. Creating role: ecsTaskExecutionRole for ECS
  • 6.2.4. Deploy to the ECS cluster
  • 6.2.4. a. Creating an ECS cluster
  • 6.2.4. b. Creating a task definition
  • 6.2.4. c. Creating a service with AWS ECS Fargate and Outputs
  • 6.2.4. d. Deleting a service in AWS ECS Fargate

Step-6.3. With Amazon ECS by using scripts in Jenkins pipeline.

  • 6.3.1. Creating a new folder for the Amazon ECS test
  • 6.3.2. Creating a “GitHub Personal Access Token” on GitHub.
  • 6.3.3. Creating a GitHub Webhook to auto-trigger the Vero Digital application pipeline
  • 6.3.4. Creating role: ecsTaskExecutionRole for ECS
  • 6.3.5. Creating an ECS cluster
  • 6.3.6. Creating a task definition file
  • 6.3.7. Registering the task definition file
  • 6.3.8. Creating a service (Vero Digital web server and MSSQL Server)
  • 6.3.9. Preparing and explaining a Jenkinsfile for the Vero Digital app pipeline
  • 6.3.10. Pushing created files to the remote repo (GitHub)
  • 6.3.11. Preparing and Running a Jenkins pipeline
  • 6.3.12. Troubleshooting in the Jenkinsfile staging pipeline
  • 6.3.13. The output of the Jenkinsfile staging pipeline on Amazon ECS

Step-7. With Kubernetes cluster

Step-7.1. Using Amazon Elastic Kubernetes Service (EKS) with Jenkins pipeline.

  • 7.1.1. Preparing development environment to run the Jenkins pipeline.
  • 7.1.2. Creating and working in different branches on Jenkins Server
  • 7.1.3. Creating Kubernetes manifest YAML files using the Kompose converter tool
  • 7.1.4. Setting up a Helm v3 chart repository in Amazon S3 for CI/CD pipeline in Jenkins
  • 7.1.5. Creating Amazon S3 Bucket using AWS Management Console
  • 7.1.6. Installing and Using the Helm-S3 plugin
  • 7.1.7. Creating “A” record in AWS Route 53 for the Vero Digital Microservice app
  • 7.1.8. Connecting and configuring Jenkins
  • 7.1.9. Creating a repo on AWS ECR using the Jenkins pipeline
  • 7.1.10. Creating Jenkins pipeline scripts
  • 7.1.10.a. Preparing a script to tag AWS ECR repo tags
  • 7.1.10.b. Preparing a script to build the production docker images
  • 7.1.10.c. Preparing a script to push the production docker images
  • 7.1.10.d. Preparing a Jenkinsfile script
  • 7.1.11. Creating values-template.yaml and changing the domain name in it
  • 7.1.12. Getting a TLS (Transport Layer Security) certificate for HTTPS connection using Let’s Encrypt, Cert Manager
  • 7.1.13. Pushing created files to the remote repo (GitHub)
  • 7.1.14. Creating Amazon EKS cluster
  • 7.1.14.1. Creating cluster.yaml
  • 7.1.14.2. Creating a cluster using eksctl
  • 7.1.14.3. Creating ingress controller
  • 7.1.14.4. Checking the cluster
  • 7.1.15. Checking to push digital-helm-chart to Amazon S3 bucket
  • 7.1.16. Running a Kubernetes Jenkinsfile pipeline
  • 7.1.17. The output of the Kubernetes Jenkinsfile pipeline

Step-1. Preparing development environment to run the containers.

Running with the Docker compose file in Docker containers.

Firstly, I prepared a development environment to run the containers in it, as shown in the figure above. For this,

- I used an AWS EC2 instance (t2.small (2 vCPUs and 2 GiB memory), 8 GiB volume, Amazon Linux 2023 AMI).

- I created a new security group whose inbound rules allow ports 1433 and 80 from anywhere (necessary). I also opened ports 443 and 8000 in the inbound rules (optional).

- I installed the programs necessary to run the test application, such as Docker, Docker Compose, AWS CLI version 2, and Git.

SQL Server has the following system requirements for Linux (source: the Installation guidance for SQL Server on Linux page): (2)

  • Memory: 2 GB
  • File system: XFS or EXT4 (other file systems, such as BTRFS, aren't supported)
  • Disk space: 6 GB
  • Processor speed: 2 GHz
  • Processor cores: 2 cores
  • Processor type: x64-compatible only

If you use Network File System (NFS) remote shares in production, note the following support requirements:

  • Use NFS version 4.2 or higher. Older versions of NFS don't support required features, such as fallocate and sparse file creation, common to modern file systems.
  • Locate only the /var/opt/mssql directories on the NFS mount. Other files, such as the SQL Server system binaries, aren't supported.

Note: To ensure optimal performance of SQL Server, your machine should have at least 4 GB of memory (source: the Create PHP apps using SQL Server on Ubuntu page). (3)
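A quick way to check whether your instance meets these requirements is with standard Linux tools (the thresholds in the comments are the ones from the requirements list above):

```shell
# Total and available memory (SQL Server needs at least 2 GB, ideally 4 GB)
free -h

# Free disk space on the root volume (at least 6 GB required)
df -h /

# Number of CPU cores (at least 2 recommended)
nproc
```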

Note: When I use Amazon ECS, Amazon EKS, Jenkins, and Rancher for this task (Part 6.3. and later), I will use a Jenkins server with different tools installed in it, as shown in the figures below.

Running with the Kubernetes YAML files by using Helm on the Elastic Kubernetes Service (EKS).
Running with the task definition file on the Amazon Elastic Container Service (ECS) cluster (Fargate).

For detailed information about development server preparation, you can review my Medium article: Working with Microservices-1: Running a Java app that consists of 10 Microservices on a Development server. (In that article series, we work with a Spring Pet Clinic application consisting of 10 microservices in Development, Testing, Staging, and Production environments, using different DevOps tools.)

Step-2. Preparing GitHub repository

2.a. Creating DevOps-task repository for the Development Stage

I created a repository on GitHub and then cloned it to my local server.

After making the necessary arrangements in it (organizing the folder structure and adding the QuickDbTest.php file under mssql-server-handson/with-docker-compose), I pushed the changes to the new repo I created on GitHub with the following commands, as shown in the picture below.

git add .
git commit -m 'Arranged folders order, added QuickDbTest.php file, etc. for the main branch'
git push

2.b. Working in different branches for this project

I created a development branch with the “git branch development” command, switched to the `development` branch with the “git checkout development” command, and verified it as shown in the picture.

git branch development
git checkout development
git branch
main
* development

In this way, when an error occurs, we can recover the project by reverting to a working branch. This is one of the main purposes and biggest benefits of using Git. (4)

2. c. Pushing created files to the GitHub repo

After I prepared all the files (in the 6th step, or in intermediate steps when needed), I committed the changes and pushed the created files (the Dockerfiles, the Docker Compose file, etc.) to the remote repo (GitHub). Then I checked out the `main` branch. Run the commands below, as shown in the picture.

git add .
git commit -m 'added docker files and docker-compose file, etc. in the development branch'
git push --set-upstream origin development
git checkout main # In the production stage.
My GitHub repo

The best practice is to create a different branch and continue from the new branch in the next stage.


As a result

In this part of the series, I explained what we will do in this series of articles, and we saw the outputs that we will get at the end of each big step.

We prepared a development environment to run the containers and created a DevOps repository for the Development Stage in the Developer server and GitHub.

Alongside Microsoft SQL Server, we will also build a sample production CI/CD process. What we do here can be reused in similar projects.

You can find the necessary files in my GitHub repo (https://github.com/cmakkaya/mssql-server-handson). I will repeat this hands-on together with you. Files will be created and accumulated in this repo as they are written in the steps of the article.

Share this article with friends in your network and help others to upskill.


For more info and questions, please contact me on Linkedin or Medium.

Next post

I have reached the 15th part of my “Working with Microservices” article series, which consists of 22 parts, and have published “Working with Microservices-15: Importing Amazon EKS cluster to Rancher in the Production Stage”. I will try to write the two article series in parallel; I hope they will be useful for everyone.

EDITING: In the next post, I will talk about the problems and solutions I encountered while installing and running Microsoft SQL Server, and I will share the resources and solutions I used to fix the errors. Normally, the “Writing Dockerfiles and Docker compose file for Web app and MS SQL Server” article was next, but due to requests from readers, I gave priority to the Working with Database: Step 5- CSI: Troubleshooting in Microsoft SQL Server containers article and published it first.

IT WILL BE PUBLISHED LATER: In the next post, we will write Dockerfiles for MS SQL Server and the web server (Apache) + PHP. One Dockerfile will build the Microsoft SQL database image, and the other a web server (Apache) image with a pre-defined PHP script. We will install the proper driver to connect to Microsoft SQL Server step by step, by following the relevant Microsoft and Docker Hub pages.

Then we will create two Docker containers: one running an MSSQL database and the other running a web server with a pre-defined PHP script.
* MSSQL-Server Docker Container (Database): It will perform the MSSQL Server installation, set a password for SA, and run the MSSQL service.
* Web-Server Docker Container (API): It will install the Apache web server, install PHP 8.1, install the proper driver to connect to MSSQL Server, and add the script QuickDbTest.php to the web-root folder.
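The QuickDbTest.php script will do something along the lines of the sketch below (the server name, credentials, and messages are illustrative assumptions; the real script is in the repo and uses the sqlsrv driver we install in Step 3):

```php
<?php
// Connect to the MSSQL container by its service name
// (illustrative values; real ones come from the repo's config)
$serverName = "db";
$connectionInfo = [
    "Database" => "TestDB",
    "UID"      => "SA",
    "PWD"      => "YourStrong!Passw0rd",
];

$conn = sqlsrv_connect($serverName, $connectionInfo);

if ($conn === false) {
    // Print the driver's error details if the connection fails
    die(print_r(sqlsrv_errors(), true));
}
echo "DB connected successfully.";
?>
```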

Running with Docker Compose
As a result of running with Docker Compose, we will get the results shown in the video above.

I hope you enjoyed reading this article. You can share this article with friends in your network and help them to upskill.

I frequently share articles about Cloud and DevOps tools and resources. If you follow me on Medium or LinkedIn, you won't miss future articles. I wish you growing success on your DevOps and Cloud journey.

Happy Clouding…


Cumhur Akkaya

✦ DevOps/Cloud Engineer, ✦ Believes in learning by doing, ✦ Dedication To Lifelong Learning, ✦ Tea and Coffee Drinker. ✦ Linkedin: linkedin.com/in/cumhurakkaya