You can leverage ADF’s parameters feature with Mapping Data Flows to create pipelines that dynamically create new target tables. You can set those table names through Lookups or other activities. I’ve written a very simple post below on the steps you’ll need to follow to do this:
- Create a new ADF pipeline.
- Switch on “Data Flow Debug”.
- Create 2 new datasets. We’ll use Azure SQL DB for this demo, so create 2 Azure SQL DB datasets.
- Go to the Parameters tab in each dataset and add a parameter named “tablename” as a string.
- In both datasets, don’t pick a table from the dropdown. Instead, leave the table name blank and click in the table name field to display the “Add dynamic content” link.
- Select the “tablename” parameter as the value for the Table field; ADF generates the expression @dataset().tablename for you. (A JSON sketch of a parameterized dataset appears after this list.)
- Do this for both datasets. We’ll use one dataset as the source and the other as the sink.
- Add an Execute Data Flow activity to your pipeline canvas and create a new data flow.
- Inside the data flow canvas, select the dataset for the source table.
- Add a Sink transformation directly after the source.
- Choose the dataset for the destination table.
- Back in the pipeline, click on the Execute Data Flow activity.
- In the Settings tab, you’ll see prompts for the values of the source and sink table name parameters.
- For this demo, I just typed in static text (see the Execute Data Flow snippet after this list). I have an existing table in my SQL DB called “dbo.batting1”, and I want to copy it as “dbo.batting2”.
- This pipeline will copy the table to a new table with a different name in the same database.
- In a real-world scenario, you would set these dataset parameters from the output of a Lookup or another activity so that they change dynamically at runtime (see the last snippet after this list).
- Click on “Debug” to test your pipeline.
- After the debug run completes, you should see a new table in your Azure SQL DB with the name you provided in the second (sink) dataset parameter.
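
For reference, here’s a minimal sketch of what a parameterized Azure SQL DB dataset can look like in the JSON view. The dataset and linked service names (“BattingTableDynamic”, “AzureSqlDatabase1”) are placeholders, and newer factories may split the table into separate schema/table properties instead of a single tableName, so treat this as a rough guide rather than something to paste verbatim:

```json
{
    "name": "BattingTableDynamic",
    "properties": {
        "description": "Placeholder names - adjust to match your own linked service",
        "linkedServiceName": {
            "referenceName": "AzureSqlDatabase1",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "tablename": {
                "type": "string"
            }
        },
        "type": "AzureSqlTable",
        "typeProperties": {
            "tableName": {
                "value": "@dataset().tablename",
                "type": "Expression"
            }
        }
    }
}
```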
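
And here’s roughly what the Execute Data Flow activity looks like with the static table names typed into the Settings tab. The data flow name (“CopyTableDataFlow”) and stream names (“source1”, “sink1”) are whatever you named them in your own data flow, and the exact placement of the dataset parameters can vary between ADF versions, so check your pipeline’s own JSON view:

```json
{
    "name": "Execute copy data flow",
    "type": "ExecuteDataFlow",
    "description": "Stream names and parameter layout are illustrative - verify against your pipeline's JSON view",
    "typeProperties": {
        "dataFlow": {
            "referenceName": "CopyTableDataFlow",
            "type": "DataFlowReference",
            "datasetParameters": {
                "source1": { "tablename": "dbo.batting1" },
                "sink1": { "tablename": "dbo.batting2" }
            }
        }
    }
}
```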
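
To set the sink table name dynamically instead, you would add a Lookup activity before the data flow and reference its output with a pipeline expression. The activity name (“LookupTargetTable”) and column name (“TargetTable”) below are hypothetical; the expression assumes the Lookup returns a single row containing that column and that the data flow activity runs after the Lookup succeeds:

```json
"datasetParameters": {
    "source1": { "tablename": "dbo.batting1" },
    "sink1": {
        "tablename": {
            "value": "@activity('LookupTargetTable').output.firstRow.TargetTable",
            "type": "Expression"
        }
    }
}
```

The same pattern works for the source table name, or for values coming from pipeline parameters, variables, or a ForEach loop.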