A change has been rolling out recently to the Dataverse TDS Endpoint that affects how date/time values are returned. It should give your queries better performance, but may mean a bit more work on your side to ensure the values are shown as your users expect.

Previously the endpoint would automatically convert date/time values to the time zone of the user running the query. This was convenient but highly inefficient: if you sorted or filtered by a date/time column, or retrieved too many date/time values, it was easy to hit timeout errors.

The change that has rolled out recently removes this conversion and always returns the values in UTC. As this is the format they’re held in natively within the database, queries run much faster. There’s also no longer any need to refer to columns with a utc suffix, so you can write:

SELECT TOP 10 name
FROM          account
ORDER BY      createdon DESC

to get the names of the most recently created accounts efficiently rather than having to use the createdonutc column as previously. In fact, those utc-suffix columns no longer exist and you’ll get an error if you try to use them.

Note: If this change has not been applied to your instance yet, try disabling and re-enabling the TDS Endpoint. After a few minutes the changes should be ready.

Displaying results in local time zone

Now that the query runs faster, how do we display the results? Unfortunately we’re on our own when it comes to converting the UTC values into the correct time zone. Depending on how you’re accessing the TDS Endpoint you’ve got a few options here.

Native SQL Conversion

Unfortunately the native AT TIME ZONE syntax in T-SQL isn’t supported by the TDS Endpoint, so we need to write the logic for doing the time zone conversion manually.

If you don’t need to worry about daylight saving time this can be pretty straightforward, but in most cases we need to be more accurate.
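To illustrate just how straightforward the fixed-offset case is, here is a minimal sketch (in Python rather than T-SQL, purely for illustration) that applies a constant offset, equivalent to `DATEADD(hour, -8, createdon)` in the query:

```python
from datetime import datetime, timedelta

def utc_to_fixed_offset(utc_value: datetime, offset_hours: int = -8) -> datetime:
    # Fixed-offset conversion: just add the (negative) offset to the
    # UTC value returned by the endpoint. No DST handling at all.
    return utc_value + timedelta(hours=offset_hours)

# A UTC value as returned by the TDS Endpoint
created_on = datetime(2021, 6, 15, 18, 30)
print(utc_to_fixed_offset(created_on))  # 2021-06-15 10:30:00
```

This is only correct for zones that never observe daylight saving; for US Pacific time it silently shows the wrong hour for roughly eight months of the year.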

Provided you know which time zone you’re targeting you can encode the conversion rules in your query. This isn’t pretty, but it works perfectly well. For example, the US Pacific time zone is 8 hours behind UTC (PST), or 7 hours behind during daylight saving time (PDT). Daylight saving is active from 2am on the second Sunday in March to 2am on the first Sunday in November, so we can use this query to return the createdon date in both UTC and local time:

SELECT TOP 100 createdon,
               DATEADD(hour, -8, createdon) as createdon_pst,
               DATEADD(hour, -7, createdon) as createdon_pdt,
               CASE WHEN createdon >= DATEADD(hour, 8, DATEADD(day, 7 + (8 - DATEPART(dw, DATEFROMPARTS(DATEPART(year, createdon), 3, 1))) % 7, DATETIMEFROMPARTS(DATEPART(year, createdon), 3, 1, 2, 0, 0, 0)))
                    AND createdon < DATEADD(hour, 8, DATEADD(day, (8 - DATEPART(dw, DATEFROMPARTS(DATEPART(year, createdon), 11, 1))) % 7, DATETIMEFROMPARTS(DATEPART(year, createdon), 11, 1, 1, 0, 0, 0)))
                    THEN DATEADD(hour, -7, createdon)
                    ELSE DATEADD(hour, -8, createdon)
               END AS createdon_pt
FROM           account
ORDER BY       createdon DESC

The first two columns simply convert the UTC time to PST (UTC-8) and PDT (UTC-7). The more complex part is working out whether a particular date/time falls within daylight saving time and selecting PST or PDT accordingly.

Given the year:
DATEPART(year, createdon)
we can generate the 1st March of that year:
DATEFROMPARTS(<year>, 3, 1)
From that we can work out the number of days to add to get to the second Sunday:
7 + (8 - DATEPART(dw, <1stMarch>)) % 7
The change to daylight saving time happens at 2am:
DATETIMEFROMPARTS(<year>, 3, 1, 2, 0, 0, 0)
By adding these last two together we get the time of the switch to daylight saving:
DATEADD(day, <daysToSecondSunday>, <2am1stMarch>)
This time of 2am is in PST, so we need to convert that to UTC:
DATEADD(hour, 8, <dstSwitchTime>)

We repeat the same calculation to get the time daylight saving ends for the year, using the first Sunday in November. DST ends at 2am PDT, which is 1am PST, so the query uses 1am with the same 8-hour offset to get the UTC switch time. We can then check whether the createdon date falls between these two times: if it does it’s in PDT and we subtract 7 hours, otherwise it’s PST and we subtract 8 hours.
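The boundary calculation above can be sketched outside SQL as well. This Python version (an illustration, not part of the query itself) computes the UTC instants at which US Pacific daylight saving starts and ends for a given year, using the same "nth Sunday" rule:

```python
from datetime import datetime, timedelta

def _nth_sunday(year: int, month: int, n: int) -> datetime:
    # Start at the first day of the month, then step forward to the nth Sunday.
    first = datetime(year, month, 1)
    days_to_sunday = (6 - first.weekday()) % 7  # Monday=0 ... Sunday=6
    return first + timedelta(days=days_to_sunday + 7 * (n - 1))

def pacific_dst_bounds_utc(year: int) -> tuple[datetime, datetime]:
    # DST starts at 2am PST on the second Sunday in March; 2am PST is
    # 10am UTC, hence the +8 hour offset.
    start = _nth_sunday(year, 3, 2) + timedelta(hours=2 + 8)
    # DST ends at 2am PDT (= 1am PST) on the first Sunday in November,
    # encoded as 1am plus the same 8-hour PST offset.
    end = _nth_sunday(year, 11, 1) + timedelta(hours=1 + 8)
    return start, end

print(pacific_dst_bounds_utc(2021))
# (datetime(2021, 3, 14, 10, 0), datetime(2021, 11, 7, 9, 0))
```

A date/time is in PDT exactly when it falls between these two UTC instants, which mirrors the CASE expression in the query.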

C# Application

If we have a custom application written in C# we can easily convert the returned UTC values to the local time zone the user is running the application in using standard methods on the DateTime class:

while (reader.Read())
{
  var dt = reader.GetDateTime(reader.GetOrdinal("createdon"));
  // The value comes back as DateTimeKind.Unspecified, so mark it as UTC
  // before converting to the machine's local time zone
  dt = DateTime.SpecifyKind(dt, DateTimeKind.Utc).ToLocalTime();
}
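If you have access to a proper time zone database you can avoid hand-coded DST rules entirely. As a sketch in Python using the standard zoneinfo module (the target zone name is an assumption here; substitute your users' zone):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library from Python 3.9

def utc_to_pacific(utc_value: datetime) -> datetime:
    # Mark the naive value from the endpoint as UTC, then let the
    # tz database choose PST or PDT as appropriate for that instant.
    return utc_value.replace(tzinfo=timezone.utc).astimezone(
        ZoneInfo("America/Los_Angeles"))

print(utc_to_pacific(datetime(2021, 7, 1, 12, 0)))  # 2021-07-01 05:00:00-07:00
print(utc_to_pacific(datetime(2021, 1, 1, 12, 0)))  # 2021-01-01 04:00:00-08:00
```

The tz database approach also handles historical rule changes that a hard-coded second-Sunday/first-Sunday rule does not.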

Filtering results

To filter the results based on a time the user has supplied in their local time zone, we have to do the reverse operation: convert that value back to UTC, and use the UTC value for filtering.

In SQL this would look like:

DECLARE @localtime datetime = '<user supplied value>'
DECLARE @year int = DATEPART(year, @localtime)
DECLARE @utctime datetime

DECLARE @dststart datetime = DATEADD(day, 7 + (8 - DATEPART(dw, DATEFROMPARTS(@year, 3, 1))) % 7, DATETIMEFROMPARTS(@year, 3, 1, 3, 0, 0, 0))
DECLARE @dstend datetime = DATEADD(day, (8 - DATEPART(dw, DATEFROMPARTS(@year, 11, 1))) % 7, DATETIMEFROMPARTS(@year, 11, 1, 2, 0, 0, 0))

IF @localtime >= @dststart AND @localtime < @dstend
  SET @utctime = DATEADD(hour, 7, @localtime)
ELSE
  SET @utctime = DATEADD(hour, 8, @localtime)

SELECT name
FROM   account
WHERE  createdon > @utctime

Or in C#:

public void ListAccountsCreatedSince(DateTime localTime)
{
  var utcTime = localTime.ToUniversalTime();

  using (var con = new SqlConnection(connectionString))
  using (var cmd = con.CreateCommand())
  {
    con.Open();
    cmd.CommandText = "SELECT name FROM account WHERE createdon > @utcTime";
    cmd.Parameters.AddWithValue("@utcTime", utcTime);

    using (var reader = cmd.ExecuteReader())
    {
      while (reader.Read())
        Console.WriteLine(reader.GetString(0));
    }
  }
}
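For comparison, the same local-to-UTC step sketched in Python with zoneinfo (illustrative only; the zone name is an assumption):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library from Python 3.9

def pacific_to_utc(local_value: datetime) -> datetime:
    # Attach the Pacific zone to the user-supplied naive value,
    # then convert to UTC for use in the createdon filter.
    return local_value.replace(
        tzinfo=ZoneInfo("America/Los_Angeles")).astimezone(timezone.utc)

print(pacific_to_utc(datetime(2021, 7, 1, 9, 0)))  # 2021-07-01 16:00:00+00:00
```

The resulting UTC value is what you would bind to the @utcTime parameter before running the query.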

Power BI

Currently I haven’t found a method in Power BI to handle the conversion to the local time zone to display results or converting local time to UTC for filtering. If you know a way of doing this, please let me know in the comments!

