Migrating historical logs to a new logging platform such as Datadog can be challenging, especially when you have to account for both active and archived logs. Observability Pipelines simplifies the move: its dual-shipping capability routes active logs to the new platform without disrupting existing workflows. Historical logs, which are often essential for security investigations and compliance, require a tailored approach because most logging platforms only accept incoming logs whose timestamps fall within a recent window.

Observability Pipelines addresses this by routing historical logs through customizable pipelines that correct timestamps and forward the logs to cloud storage for later rehydration. The process starts by extracting logs from archives such as Elasticsearch or Splunk, or from cloud storage services such as Amazon S3, Google Cloud Storage, and Azure Blob Storage, typically with custom scripts that read the archived data and send it to the pipeline (a sketch follows below).

Once logs are in the pipeline, the Custom Processor can adjust timestamps and apply other transformations so that the logs are ready for long-term storage and eventual rehydration into Datadog for detailed analysis. Observability Pipelines also supports a range of processing features along the way, including Grok parsing, JSON parsing, and sensitive data redaction, giving you fine-grained control over how logs are migrated and managed.
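As a rough sketch of the extraction step, the script below reads gzipped, newline-delimited JSON archives from an S3 bucket and forwards them in batches to an HTTP source on an Observability Pipelines worker. The bucket name, prefix, batch size, and worker URL are placeholder assumptions, and it assumes the worker's HTTP source is configured to accept JSON arrays; adapt all of these to your archive layout and pipeline configuration.

```python
import gzip
import json

import boto3
import requests

# Hypothetical values: the bucket/prefix of the archived logs and the
# HTTP source URL exposed by the Observability Pipelines worker.
ARCHIVE_BUCKET = "legacy-log-archive"
ARCHIVE_PREFIX = "logs/2023/"
PIPELINE_URL = "http://op-worker.internal:8282/"
BATCH_SIZE = 500

s3 = boto3.client("s3")


def iter_archived_events():
    """Yield one JSON log event per line from each gzipped archive object."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=ARCHIVE_BUCKET, Prefix=ARCHIVE_PREFIX):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=ARCHIVE_BUCKET, Key=obj["Key"])["Body"]
            for line in gzip.GzipFile(fileobj=body):
                yield json.loads(line)


def ship(batch):
    """POST a batch of events to the pipeline's HTTP source."""
    resp = requests.post(PIPELINE_URL, json=batch, timeout=30)
    resp.raise_for_status()


batch = []
for event in iter_archived_events():
    batch.append(event)
    if len(batch) >= BATCH_SIZE:
        ship(batch)
        batch = []
if batch:
    ship(batch)
```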
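Inside the pipeline, the Custom Processor handles the timestamp correction itself. The Python function below is only an illustration of that logic, not the processor's actual syntax: it promotes the event's original timestamp (the source field names are assumptions about the archive format) to a normalized RFC 3339 value, so the log is attributed to the time it was originally generated rather than the time it was re-ingested.

```python
from datetime import datetime, timezone


def fix_timestamp(event: dict) -> dict:
    """Re-attribute an event to its original time before re-ingestion.

    Illustration only: in Observability Pipelines this logic lives in the
    Custom Processor. The source field names ("original_time",
    "@timestamp") are assumptions about the archive format.
    """
    raw = event.get("original_time") or event.get("@timestamp")
    if isinstance(raw, str):
        # Normalize to RFC 3339 in UTC so downstream tools agree on the time.
        parsed = datetime.fromisoformat(raw.replace("Z", "+00:00"))
        event["timestamp"] = parsed.astimezone(timezone.utc).isoformat()
    return event


# Example: an archived event whose original time must be preserved.
event = {"@timestamp": "2023-06-01T12:34:56Z", "message": "user login"}
print(fix_timestamp(event)["timestamp"])  # 2023-06-01T12:34:56+00:00
```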
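Sensitive data redaction works the same way conceptually: the pipeline scans log content for matching patterns and masks them before the logs reach storage. The snippet below is a minimal, hypothetical illustration using a single email pattern; in practice you would rely on the pipeline's redaction processors and their rules rather than hand-rolled regexes.

```python
import re

# Hypothetical pattern for illustration only.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")


def redact(message: str) -> str:
    """Mask email addresses in a log message before it is stored."""
    return EMAIL.sub("[REDACTED_EMAIL]", message)


print(redact("password reset requested by jane.doe@example.com"))
# password reset requested by [REDACTED_EMAIL]
```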