
1 min read

Use Azure Data Factory to Ingest On-Premises Data


TimeXtender Tuesday - Episode 4

This topic in our series covers using Azure Data Factory (ADF) to ingest on-premises data into the ODX storage (ADLS Gen2). If you'd like to review the previous steps so you can follow along, check out our TimeXtender Tuesday playlist on YouTube.

If you have on-premises data and need to push it to your data lake, Azure Data Factory's Self-Hosted Integration Runtime (IR) is the recommended way to do exactly that. Joseph Treadwell walks through setting up the ADF resource and the Self-Hosted IR, and then ingests data into the data lake (ODX storage).

  1. Create an Azure Data Factory (ADF) resource
  2. Enable an App Registration to access the ADF resource
  3. Set up a Self-Hosted Integration Runtime (IR) for on-premises data
  4. Add the on-premises data source to ODX Storage
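The episode walks through these steps in the Azure portal. As a rough sketch, steps 1–3 can also be scripted with the Azure CLI; the resource names below are hypothetical placeholders, the `datafactory` command group ships as an Azure CLI extension, and step 4 is done from the TimeXtender ODX configuration rather than the CLI:

```shell
#!/usr/bin/env bash
# Hypothetical names -- substitute your own resource group, factory, and app.
RG="tx-demo-rg"
ADF="tx-demo-adf"
LOCATION="westeurope"
APP_NAME="tx-odx-app"
IR_NAME="OnPremSelfHostedIR"

# 1. Create the Azure Data Factory resource.
az extension add --name datafactory
az datafactory create \
  --resource-group "$RG" \
  --factory-name "$ADF" \
  --location "$LOCATION"

# 2. Register an app and grant it access to the ADF resource
#    (the "Data Factory Contributor" role scoped to this factory).
APP_ID=$(az ad app create --display-name "$APP_NAME" --query appId -o tsv)
az ad sp create --id "$APP_ID"
az role assignment create \
  --assignee "$APP_ID" \
  --role "Data Factory Contributor" \
  --scope "$(az datafactory show -g "$RG" --factory-name "$ADF" --query id -o tsv)"

# 3. Create a Self-Hosted Integration Runtime, then fetch an auth key
#    to register the IR software installed on the on-premises machine.
az datafactory integration-runtime self-hosted create \
  --resource-group "$RG" \
  --factory-name "$ADF" \
  --name "$IR_NAME"
az datafactory integration-runtime list-auth-key \
  --resource-group "$RG" \
  --factory-name "$ADF" \
  --integration-runtime-name "$IR_NAME"
```

Running this requires an authenticated Azure CLI session and a subscription; the printed auth key is what you paste into the Self-Hosted IR installer on the on-premises machine.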

