Transferring Data from AWS Postgres to Local SQL Server using ETL

I need to re-implement an existing SSIS package that loads JSON data from a Postgres database hosted on AWS (part of my CRM system) into a local SQL Server for reporting purposes. The JSON needs to be converted into a structured format suitable for the SQL database. Although my Python knowledge is limited, I see this as a valuable opportunity to build my skills. Can anyone recommend effective tools and libraries for this task? Are there any well-respected Python ETL developers or resources I should follow for best practices? I’m particularly interested in community-driven insights rather than vendor materials, as I’ve already read conflicting information online. I currently have a simple Python script set up using ODBC drivers for Postgres, but I would really appreciate guidance on establishing good practices from experts in the field.

Hey ClimbingMonkey! Isn’t it exciting exploring DIY solutions with Python? Have you tried using libraries like SQLAlchemy or Pandas for structuring JSON? They’re quite community-driven. Also, have you considered any communities or online courses that focus on Python and data transformation? I’d love to hear about your experience! 🙂

For your ETL process, you might consider Apache Airflow as an orchestrator, since it’s widely used for managing workflows that include data transfers and transformations. For turning JSON into a structured format, the json_normalize function in Pandas is quite useful, as it flattens nested JSON into tabular columns. It’s also worth checking out open-source repositories on GitHub; they offer practical insights and real-world examples from experienced developers, which can be invaluable as you build your skill set.
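To make the json_normalize suggestion a bit more concrete, here’s a minimal sketch; the record shape and column names are made-up placeholders, not anything from your actual CRM:

```python
import pandas as pd

# A made-up CRM-style record with nested JSON, purely to show the flattening.
records = [
    {
        "id": 101,
        "contact": {"first_name": "Ada", "last_name": "Lovelace"},
        "address": {"city": "London", "country": "UK"},
    }
]

# json_normalize flattens nested objects into prefixed column names
# (contact_first_name, address_city, ...), which map cleanly onto SQL columns.
flat = pd.json_normalize(records, sep="_")
print(flat.columns.tolist())
# ['id', 'contact_first_name', 'contact_last_name', 'address_city', 'address_country']
```

From there, flat.to_sql(...) (or a plain INSERT loop) gets the rows into SQL Server.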

Hey there! Have you thought about using DBeaver? It’s really helpful for visualizing transformations, plus it supports both PostgreSQL and SQL Server. Also, reading blogs by people sharing real-world ETL stories might spark some ideas. Sometimes it’s these little tips that make all the difference! Good luck!

Hey ClimbingMonkey, trying out new stuff in Python sounds fun! Have you looked into data wrangling with pandas? It’s super versatile. What obstacles do you expect when handling the JSON data? I wonder if diving into some Python meetups or webinars might open new perspectives for you. What’s your biggest curiosity right now?

For a smooth data transfer from AWS Postgres to a local SQL Server, you might want to explore psycopg2 combined with SQLAlchemy. Psycopg2 provides a robust PostgreSQL connection, while SQLAlchemy can help map the JSON data into SQL tables efficiently. Getting comfortable with Pandas for batch processing will also pay off, especially when working with JSON data. Following developers on platforms like Stack Overflow and GitHub can give you practical advice and examples, which is crucial when picking up new skills in Python.
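If it helps, here’s a minimal end-to-end sketch of that idea. Everything specific in it is an assumption: the connection strings, the crm_events table, the payload column holding JSON as text, and the target table name are all placeholders to swap for your own:

```python
import json

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection strings -- replace hosts, credentials, databases and
# driver names with your own. SQLAlchemy uses psycopg2 for the Postgres side
# and pyodbc for the SQL Server side.
pg_engine = create_engine(
    "postgresql+psycopg2://user:password@your-aws-host:5432/crm"
)
mssql_engine = create_engine(
    "mssql+pyodbc://user:password@localhost/reporting"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Hypothetical source table and columns, purely for illustration.
query = "SELECT id, payload FROM crm_events"

# Stream the source in chunks so a large table never has to fit in memory.
for chunk in pd.read_sql(query, pg_engine, chunksize=5_000):
    # payload is assumed here to be a JSON string column; parse and flatten it.
    flat = pd.json_normalize(chunk["payload"].map(json.loads).tolist(), sep="_")
    flat["source_id"] = chunk["id"].values
    # Append each flattened chunk into the reporting table.
    flat.to_sql("crm_events_flat", mssql_engine, if_exists="append", index=False)
```

If the inserts into SQL Server feel slow, the pyodbc fast_executemany option (create_engine(..., fast_executemany=True) on the mssql side) usually helps, and defining the target tables explicitly with SQLAlchemy gives you tighter control over column types than letting to_sql infer them.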