
# Ingest
Simon can ingest your data through many different channels, unifying it for a single view of your customer. We ingest data in two primary ways:

- **Batch**: Data is ingested once a day from data warehouses and other third-party vendors.
- **Real-time**: We stream data in via first- and/or third-party tools.
You can combine multiple batch and real-time sources to _get_ that single view of your customer.
We have some in-house tools we use to gather event data, primarily through Simon Signal. We can also leverage webhooks to pull events from nearly any source you offer; those events can then be used to create triggers for your marketing campaigns. You can also ingest from a data warehouse, from files, from Snowflake shares, from streaming sources, and more. Some of these use cases are pretty specific, but others simply depend on your preferred methods and current data sources.
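As a sketch of the webhook-style real-time path, the snippet below builds and posts a JSON event to a hypothetical ingest endpoint. The URL, field names, and payload shape here are illustrative assumptions, not Simon's actual webhook schema:

```python
import json
from urllib import request

def build_event(event_type, user_id, properties):
    """Assemble a minimal event payload (hypothetical field names)."""
    return {
        "event_type": event_type,   # e.g. "cart_updated"
        "user_id": user_id,         # identifier used to unify the customer view
        "properties": properties,   # arbitrary event attributes
    }

def send_event(endpoint, event):
    """POST the event as JSON to a webhook-style ingest endpoint."""
    body = json.dumps(event).encode("utf-8")
    req = request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status

event = build_event("cart_updated", "user-123", {"cart_total": 42.50})
print(json.dumps(event))
```

In practice the receiving system would validate the payload and map `user_id` onto your existing customer identifiers so the event lands in the unified profile.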
Data ingested upstream triggers personalized marketing campaigns downstream, delivered to customers across many channel options. For example, we may ingest data about your customers' shopping preferences, carts, and demographics via a combination of your database, webhooks, and Simon Signal. Your end users can then use that data in the Simon platform to create campaigns based on multiple triggers, sending your customers personalized messages that help drive sales.
Here is one sample architecture:

<!--
## Ingestion Sources

Some ingestion methods are pretty straightforward, like data warehouses and email service providers (ESPs). Some use cases need a little more explanation, like how to ingest analytics data or drop files to Simon on a regular cadence. The documents included in the following sections (Batch Ingestion and Real-Time Ingestion) walk you through each method.

## Ingestion Speeds

| Ingest Source | Dataset | Speed | Notes |
| --- | --- | --- | --- |
| Database | Anything not event-triggered | 1-3x/daily (requires pipe run) | |
| Database | Event-triggered | Real-time (~5min lag) | |
| Event stream | Event-triggered | Real-time (~5min lag) | |
| S3 | Recurring file feed | 1x daily (requires pipe run) | Data is loaded from S3 into Snowflake every 30 mins, but the data must run through Simon's data pipe to be usable in the platform |
| S3 | Recurring file feed - contact event ("events" tab in segment builder) | 1-2 hour delay (requires sb2 pipe run) | |
| CSV | Immediate use without any additional data from Simon | Real-time | |
| CSV | Blend with additional data from Simon | 1-3x/daily (requires pipe run) | |
-->
# Egress
Many of the same sources used for ingestion can also be used to push data back out to your customer-facing or analytics tools, but they're configured separately from ingestion. Common export methods include files, data feeds, and more. Methods and details are provided in the Outbound section.