MS SQL Server SSRS (brief info):


SQL Server Reporting Services (SSRS) is a part of SQL Server that allows building various reports.
SSRS runs as a separate Windows service (an EXE file) on an MS SQL Server machine and has to be started separately.
An SSRS report is stored in a special file format, RDL. Here is an example of an RDL file (at design time, not at runtime):
And here is the design view of an RDL file opened in Visual Studio:




- The “1” is the design of the file;
- The “2” is a simple static text within the file;
- The “3” is a Tablix control embedded into the report. It is connected to a Data Source that is denoted by the number “4”. Therefore, at runtime the “3” Tablix may have many rows (possibly spread over many pages) brought from the database that the “4” refers to. The “4” may be either a simple SQL query or a call to a large and sophisticated SQL procedure.
Reports may be parameterized, which improves their flexibility.

An instance of SSRS often has a URL like http://<SQL Server hostname>/Reports .

Azure Data Factory: Data Flow: how to implement a “sink” stream

This task is described in the following Udemy lesson: https://www.udemy.com/course/learn-azure-data-factory-from-scratch/learn/lecture/23931350#overview .
We already have a set of streams; now we have to finish it with a final “sink” stream.
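
Under the hood a mapping Data Flow is saved as JSON, and the sink we add in the UI becomes an entry in its "sinks" list (plus a sink(...) call in the script). Here is a minimal sketch of that structure; all the names in it (df_example, SourceCases, SinkCases, ds_source_cases, ds_sink_cases) are hypothetical and only illustrate the shape:

{
  "name": "df_example",
  "properties": {
    "type": "MappingDataFlow",
    "description": "Sketch only: one source stream finished with one sink stream; all names are hypothetical.",
    "typeProperties": {
      "sources": [
        {
          "name": "SourceCases",
          "dataset": { "referenceName": "ds_source_cases", "type": "DatasetReference" }
        }
      ],
      "sinks": [
        {
          "name": "SinkCases",
          "dataset": { "referenceName": "ds_sink_cases", "type": "DatasetReference" }
        }
      ],
      "transformations": [],
      "script": "source(allowSchemaDrift: true) ~> SourceCases\nSourceCases sink(allowSchemaDrift: true) ~> SinkCases"
    }
  }
}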

See more in the Word file at https://docs.google.com/document/d/1mFTNMsZ9RJcOz5ilXcApZcTGoMtfPIzJ/edit?usp=sharing&ouid=106961909928818620244&rtpof=true&sd=true .

Azure Data Factory: Data Flow: how to do a Code Lookup

This task is described in the following Udemy lesson: https://www.udemy.com/course/learn-azure-data-factory-from-scratch/learn/lecture/23931346#overview .

Let’s suppose that we have the following Data Flow item:
It contains several columns, in particular a Country column:



We would like to do the following transformation:



See more in the file https://docs.google.com/document/d/1aT2N_FfLWQtPIPauh8pEAf0QNtfHPt4H/edit#heading=h.gjdgxs .
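
For reference, in the underlying data flow JSON the lookup appears as an extra transformation plus a lookup(...) call in the "script" property. A minimal sketch, assuming hypothetical stream names Cases and CountryCodes and hypothetical columns Country, CountryName and CountryCode:

{
  "transformations": [ { "name": "LookupCountryCode" } ],
  "script": "Cases, CountryCodes lookup(Country == CountryName, multiple: false, pickup: 'any', broadcast: 'auto') ~> LookupCountryCode"
}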

Azure DF: how to read data from HTTP

A problem: I need to read data from an HTTP location and put it into Azure Storage.

A solution: create a “source” Linked Service that points to the HTTP site, and a Dataset that points to a specific file inside that site. Then create a “sink” Linked Service that points to the Azure Storage account, and a Dataset that points to a specific file inside the Azure Storage.
Then create a Pipeline that contains a Copy Data activity which specifies the HTTP Dataset as its “source” and the Storage Dataset as its “sink”.
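
For reference, the Copy Data activity that this produces is stored as JSON roughly along these lines (a sketch only; the dataset names ds_http_hospital_admissions and ds_blob_hospital_admissions are hypothetical):

{
  "name": "CopyHospitalAdmissions",
  "type": "Copy",
  "description": "Sketch only: copies a delimited file from an HTTP dataset to an Azure Blob Storage dataset.",
  "typeProperties": {
    "source": { "type": "DelimitedTextSource", "storeSettings": { "type": "HttpReadSettings" } },
    "sink": { "type": "DelimitedTextSink", "storeSettings": { "type": "AzureBlobStorageWriteSettings" } }
  },
  "inputs": [ { "referenceName": "ds_http_hospital_admissions", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "ds_blob_hospital_admissions", "type": "DatasetReference" } ]
}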

Let the HTTP file we want to read have the address https://raw.githubusercontent.com/cloudboxacademy/covid19/main/ecdc_data/hospital_admissions.csv. Then let’s first create a “source” Linked Service that points to the HTTP site:
Go to Linked Services and create a new one:
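
Once created, the Linked Service (and the Dataset that uses it) can also be expressed as JSON. A minimal sketch, assuming hypothetical names ls_http_github and ds_http_hospital_admissions, with the file address split into a base URL and a relative URL:

{
  "name": "ls_http_github",
  "properties": {
    "type": "HttpServer",
    "description": "Sketch only: anonymous HTTP linked service pointing at the GitHub raw-content host.",
    "typeProperties": {
      "url": "https://raw.githubusercontent.com/",
      "authenticationType": "Anonymous"
    }
  }
}

{
  "name": "ds_http_hospital_admissions",
  "properties": {
    "type": "DelimitedText",
    "description": "Sketch only: CSV dataset that points at one file inside the HTTP site.",
    "linkedServiceName": { "referenceName": "ls_http_github", "type": "LinkedServiceReference" },
    "typeProperties": {
      "location": {
        "type": "HttpServerLocation",
        "relativeUrl": "cloudboxacademy/covid19/main/ecdc_data/hospital_admissions.csv"
      },
      "columnDelimiter": ",",
      "firstRowAsHeader": true
    }
  }
}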



Azure DF: use Lookup -> ForEach to iterate a collection

A problem: I have a collection of data stored in a JSON file:

[
   {
       "sourceRelativeURL":"cloudboxacademy/covid19/raw/main/ecdc_data/cases_deaths.csv",
       "sinkFileName":"cases_deaths.csv"
   },
   {
       "sourceRelativeURL":"cloudboxacademy/covid19/raw/main/ecdc_data/hospital_admissions.csv",
       "sinkFileName":"hospital_admissions.csv"
   }
]
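
A solution (a sketch, assuming hypothetical dataset names ds_file_list_json, ds_http_source and ds_blob_sink, and hypothetical dataset parameters relativeURL and fileName): a Lookup activity with firstRowOnly set to false returns the whole JSON array in its output.value; a ForEach activity then iterates over it and passes @item().sourceRelativeURL and @item().sinkFileName into a parameterized Copy activity:

{
  "name": "pl_copy_file_list",
  "properties": {
    "description": "Sketch only: Lookup reads the JSON array, ForEach copies each listed file.",
    "activities": [
      {
        "name": "LookupFileList",
        "type": "Lookup",
        "typeProperties": {
          "source": { "type": "JsonSource" },
          "dataset": { "referenceName": "ds_file_list_json", "type": "DatasetReference" },
          "firstRowOnly": false
        }
      },
      {
        "name": "ForEachFile",
        "type": "ForEach",
        "dependsOn": [ { "activity": "LookupFileList", "dependencyConditions": [ "Succeeded" ] } ],
        "typeProperties": {
          "items": { "value": "@activity('LookupFileList').output.value", "type": "Expression" },
          "activities": [
            {
              "name": "CopyOneFile",
              "type": "Copy",
              "typeProperties": {
                "source": { "type": "DelimitedTextSource", "storeSettings": { "type": "HttpReadSettings" } },
                "sink": { "type": "DelimitedTextSink", "storeSettings": { "type": "AzureBlobStorageWriteSettings" } }
              },
              "inputs": [
                {
                  "referenceName": "ds_http_source",
                  "type": "DatasetReference",
                  "parameters": { "relativeURL": { "value": "@item().sourceRelativeURL", "type": "Expression" } }
                }
              ],
              "outputs": [
                {
                  "referenceName": "ds_blob_sink",
                  "type": "DatasetReference",
                  "parameters": { "fileName": { "value": "@item().sinkFileName", "type": "Expression" } }
                }
              ]
            }
          ]
        }
      }
    ]
  }
}

The key point is firstRowOnly: false — without it the Lookup activity returns only the first element of the array instead of the whole collection.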
