SQL 2012: Let's Make a Date with DATEFROMPARTS and TRY_PARSE!
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[tblDateTime]') AND type in (N'U'))
DROP TABLE [dbo].[tblDateTime]
GO
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[tblDateTimeToConvert]') AND type in (N'U'))
DROP TABLE [dbo].tblDateTimeToConvert
GO
Create table tblDateTimeToConvert (col1 varchar(9) null)
Create table tblDateTime (DT datetime null)
GO
-- Populate with sample data
Insert into tblDateTimeToConvert Values
('19660612'), ('19881220'), ('19821115'), ('19851010'), ('19690213'),
('197/10526'), ('19730601'), ('19760117'), ('197 70924'), ('19790713'),
('19760811'), ('19710113'), ('19640126'), ('19710121'), ('19730"422'),
('19631026'), ('1954 0924'), ('19810807'), ('19791228'), ('19720323'),
('19780507'), ('19570124'), ('19860402'), ('19670217'), ('19830607')

This code creates two tables: one to hold our sample "dates" and one to receive the correctly converted date data. The sample data is stored in tblDateTimeToConvert as character data in the format YYYYMMDD. To make things a bit more interesting, rather than inserting clean numbers, I've laced the data with some "junk" characters that we'll clean up along the way. After the table is populated, you can view it with a simple select query: Select * from tblDateTimeToConvert. If you're lucky, your real-world data will be uniform; I'm rarely that lucky. I tend to look at the data for patterns, then write code to clean it up. I started simply with:
Select * from tblDateTimeToConvert where ISNUMERIC(col1) = 0

...which returns 4 records. So let's clean the data up using REPLACE.
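A side note worth hedging: ISNUMERIC = 0 catches the junk in this sample, but ISNUMERIC is a loose test in general, since it returns 1 for strings such as currency symbols or scientific notation that are nowhere near a valid YYYYMMDD value. A stricter sketch, assuming the same tblDateTimeToConvert table, would pattern-match for exactly eight digits:

```sql
-- ISNUMERIC accepts currency symbols and exponent notation,
-- so it can miss junk that still isn't a date
Select ISNUMERIC('1e4') as exp_notation,  -- returns 1
       ISNUMERIC('$')   as currency_sign  -- returns 1

-- Stricter check: flag any row that isn't exactly eight digits
Select * from tblDateTimeToConvert
where col1 not like '[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]'
```

For this article's data the simpler ISNUMERIC test happens to find the same four rows, so either approach works here.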
-- Remove the slashes
Update tblDateTimeToConvert Set Col1 = replace(col1, '/', '') where col1 like '%/%'
-- Remove the spaces
Update tblDateTimeToConvert Set Col1 = replace(col1, ' ', '') where col1 like '% %'
-- Remove the double quotes
Update tblDateTimeToConvert Set Col1 = replace(col1, '"', '') where col1 like '%"%'

This isn't necessarily the most efficient way of searching for and cleaning up the data, but it works. Now that our data is clean, let's look at the "old" way I would have used to populate the table.
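If you'd rather make a single pass over the table, the three updates above can be collapsed into one statement with nested REPLACE calls and a LIKE character class; a minimal sketch against the same table:

```sql
-- Strip slashes, spaces, and double quotes in one update;
-- the [/ "] class matches any row containing one of the three junk characters
Update tblDateTimeToConvert
Set col1 = replace(replace(replace(col1, '/', ''), ' ', ''), '"', '')
where col1 like '%[/ "]%'
```

Whether this is clearer than three targeted updates is a matter of taste; the nested form touches each qualifying row only once.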
-- Look at the data
Select Convert(date, (left(col1, 4) + '-' + substring(col1, 5, 2) + '-' + right(col1, 2)))
from tblDateTimeToConvert

-- Insert the data
INSERT INTO [dbo].[tblDateTime] ([DT])
Select Convert(date, (left(col1, 4) + '-' + substring(col1, 5, 2) + '-' + right(col1, 2)))
from tblDateTimeToConvert

Now, there's nothing wrong with doing the conversion this way. The nested parentheses can get confusing once you add in a time component, but it works. Using DATEFROMPARTS, we can produce our converted date like this:
-- Look at the data
Select DATEFROMPARTS(left(col1, 4), substring(col1, 5, 2), right(col1, 2))
From tblDateTimeToConvert

-- Insert it
INSERT INTO [dbo].[tblDateTime] ([DT])
Select DATEFROMPARTS(left(col1, 4), substring(col1, 5, 2), right(col1, 2))
From tblDateTimeToConvert

Now that's all fine and dandy, but reading further on MSDN, we also get the new PARSE (and TRY_PARSE) functions. I decided to play with them. I realize this isn't necessarily their intended use, but just for fun, what would those look like? First off, I tried this:
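One behavioral detail worth knowing before moving on: DATEFROMPARTS raises an error when its parts don't form a valid calendar date, and returns NULL only when one of its arguments is NULL. A quick illustration (not part of the original walkthrough):

```sql
-- Invalid parts raise an error rather than returning NULL
Select DATEFROMPARTS(2012, 2, 30)     -- error: cannot construct the date

-- A NULL argument yields a NULL result
Select DATEFROMPARTS(2012, NULL, 15)  -- NULL
```

That error-on-invalid behavior is exactly why dirty data needs cleaning first, and it's the gap the TRY_ functions below are designed to fill.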
Select try_parse(col1 as datetime) from tblDateTimeToConvert

It didn't work: PARSE raised an error on the raw YYYYMMDD strings, and TRY_PARSE simply returned NULLs, so I knew something was up. After actually reading about the function, I fed it my chopped-up date string instead, which converted the data correctly.
-- Review the data
Select col1, TRY_Parse(left(col1, 4) + '-' + substring(col1, 5, 2) + '-' + right(col1, 2) as datetime2)
from tblDateTimeToConvert

-- Insert the data
INSERT INTO [dbo].[tblDateTime] ([DT])
Select TRY_Parse(left(col1, 4) + '-' + substring(col1, 5, 2) + '-' + right(col1, 2) as datetime2)
from tblDateTimeToConvert

And there you have it!
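As a closing aside not covered above: SQL Server 2012 also added TRY_CONVERT, and since style 112 is the ISO YYYYMMDD format, it can handle these strings without any of the LEFT/SUBSTRING/RIGHT surgery. A sketch against the same tables:

```sql
-- Review the data: style 112 reads yyyymmdd directly,
-- and TRY_CONVERT returns NULL for anything unparseable
Select col1, TRY_CONVERT(date, col1, 112) from tblDateTimeToConvert

-- Insert it
INSERT INTO [dbo].[tblDateTime] ([DT])
Select TRY_CONVERT(date, col1, 112) from tblDateTimeToConvert
```

Unlike PARSE/TRY_PARSE, which rely on .NET culture-aware parsing, TRY_CONVERT uses the lighter-weight CONVERT machinery, so it's generally the cheaper option when an explicit style number fits your data.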