SPARK SQL Equivalents From TSQL

Adina Socaci
2 min read · Jan 22, 2023


Not all T-SQL functions and keywords can be used as-is in Spark SQL.

Here are some examples of the equivalents.

TSQL to Spark SQL

Examples:

TSQL

TOP goes immediately after the SELECT keyword.

SELECT TOP 3 * FROM List;

SPARK SQL

LIMIT always goes at the end of the query.

SELECT * FROM List
LIMIT 3;
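In both dialects, rows come back in no guaranteed order unless you sort, so a deterministic "top 3" needs an ORDER BY. A sketch (the column name price is just an illustration, not from the original table):

```sql
-- hypothetical column "price"; ORDER BY makes the top 3 deterministic
SELECT * FROM List
ORDER BY price DESC
LIMIT 3;
```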

TSQL

SELECT TRY_CAST('12/31/2010' AS datetime2) AS Result;

SPARK SQL

Spark's CAST only understands ISO-formatted dates (yyyy-MM-dd), so a string like this needs to_date() with an explicit pattern:

SELECT to_date('12/31/2010', 'MM/dd/yyyy') AS Result;
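If you specifically want TRY_CAST's behavior of returning NULL on failure rather than raising an error, recent Spark versions (3.0+) also provide try_cast:

```sql
-- returns NULL instead of failing, since '12/31/2010' is not a valid ISO date
SELECT try_cast('12/31/2010' AS date) AS Result;
```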

TSQL

SELECT CONVERT(datetime, '2014-05-02');
Result: '2014-05-02 00:00:00.000'

SPARK SQL

SELECT to_timestamp('2014-05-02');
Result: '2014-05-02 00:00:00'

TSQL

SELECT ISNULL(NULL, 'This is not null');

SPARK SQL

coalesce(expr1, expr2, ...) -- Returns the first non-null argument if exists. Otherwise, null.
SELECT coalesce(NULL, 1, NULL);

isnotnull(expr) -- Returns true if expr is not null, or false otherwise.
SELECT isnotnull(1);
Result: true

isnull(expr) -- Returns true if expr is null, or false otherwise.
SELECT isnull(1);
Result: false
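Of the three, coalesce() is the direct replacement for the T-SQL ISNULL example above:

```sql
-- mirrors T-SQL: SELECT ISNULL(NULL, 'This is not null');
SELECT coalesce(NULL, 'This is not null');
```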

TSQL

SELECT GETDATE();

SPARK SQL

SELECT current_timestamp();
SELECT current_date();

TSQL

FORMAT(GETDATE(), 'yyyy-MM-dd HH:mm:ss', 'de-DE')   

SPARK SQL

The closest equivalent for formatting is date_format(), which renders a timestamp as a string with a given pattern; from_utc_timestamp() can shift the timestamp into a target time zone first. Note there is no direct counterpart to the 'de-DE' culture argument:

date_format(from_utc_timestamp(current_timestamp(), 'Europe/Berlin'), 'yyyy-MM-dd HH:mm:ss')

To keep only the date part, use to_date() with the column's pattern, or current_date() for today's date:

to_date(col_name, 'yyyyMMdd')

TSQL

SELECT LEN('Give me the length of this string');

SPARK SQL

SELECT LENGTH('Give me the length of this string');
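One difference worth knowing: T-SQL's LEN ignores trailing spaces, while Spark's LENGTH counts every character:

```sql
-- T-SQL:  SELECT LEN('abc   ');     returns 3 (trailing spaces ignored)
-- Spark:  trailing spaces are counted
SELECT LENGTH('abc   ');  -- returns 6
```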

TSQL

CAST(0 as datetime)

SPARK SQL

In Spark, integers cannot be cast directly to a date type. One way around the error is to first cast the integer to a string. Note that this only makes the cast legal; the result is still NULL unless the string happens to be a valid date.

CAST((CAST(0 as string)) as date)
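T-SQL actually interprets the integer as a day offset from 1900-01-01, so CAST(0 AS datetime) yields midnight on that date. If that offset behavior is what you need, a closer Spark sketch uses date_add (assuming the offset semantics described here):

```sql
-- T-SQL: CAST(0 AS datetime) -> 1900-01-01 00:00:00.000
-- Spark sketch: add the integer as a number of days to the 1900 epoch
SELECT date_add(DATE'1900-01-01', 0);
```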

TSQL

NULL

SPARK SQL

A bare NULL has the untyped void type, which Spark cannot use in a result schema. To use it, cast it to the desired type.

CAST(NULL as INT)
CAST(NULL as STRING)
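A common place this comes up is building a literal row set where every column needs a concrete type. A small sketch (the column names are illustrative):

```sql
-- the cast gives the NULL column an explicit INT type instead of void
SELECT CAST(NULL AS INT) AS id, 'placeholder' AS name;
```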

It is always useful to check the Spark documentation for the full list of built-in functions; this article presents only a few useful examples.
