Integrating external web content into my app using server-side scraping

Hey everyone! I’m working on a personal project and I’m stuck. I want to show another website inside my Python web app. The problem is, the site I want to display doesn’t allow iframes.

I was thinking maybe I could use my server as a go-between. It would grab the pages from the other site and then send them to my app. But I’m not sure how to handle all the extra resources the pages pull in, like JavaScript, CSS, and images.

I need some advice on how to make this work. What should I do on my app’s end? And what about on the server side? Has anyone tried something like this before?
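To make the “go-between” idea concrete, here’s a rough sketch of what I was imagining on the server side, using only the standard library (`example.com` is just a placeholder for the real site, and this ignores headers, errors, POST requests, and link rewriting):

```python
# Minimal same-origin proxy sketch: every GET to this server is
# re-fetched from UPSTREAM and the body is passed through unchanged.
# Placeholder only; a real version needs error handling, header
# forwarding, and must respect the target site's terms of service.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urljoin
from urllib.request import urlopen

UPSTREAM = "https://example.com"  # hypothetical target site

def upstream_url(path: str) -> str:
    """Map a local request path onto the upstream site."""
    return urljoin(UPSTREAM + "/", path.lstrip("/"))

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with urlopen(upstream_url(self.path)) as resp:
            body = resp.read()
            self.send_response(resp.status)
            self.send_header(
                "Content-Type",
                resp.headers.get("Content-Type", "text/html"),
            )
            self.end_headers()
            self.wfile.write(body)

# To run: HTTPServer(("localhost", 8000), ProxyHandler).serve_forever()
```

Is this roughly the right shape, or is there a better pattern for this?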

I’m pretty new to web development, so any tips or suggestions would be super helpful. Thanks in advance!

I’ve encountered similar challenges in my projects. While server-side scraping can work, it’s important to consider the legal and ethical implications. Many websites have terms of service that prohibit scraping. Additionally, maintaining the functionality of a scraped site, especially with dynamic content, can be complex and resource-intensive.

Instead, I’d recommend exploring API options if available. Many sites offer APIs that provide structured data access. This approach is more reliable and usually sanctioned by the target site. If an API isn’t available, consider reaching out to the site owners for permission or collaboration. This could lead to a more sustainable and mutually beneficial solution for your project.

hey creativechef89, I’ve done something similar before. You could try using the requests library to fetch the content, then BeautifulSoup to parse it. But be careful: some sites don’t like scraping. Maybe look into Selenium if you need JS rendering. Good luck with your project!
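One part of the parsing step you’ll almost certainly need: rewriting relative links so CSS, JS, and images still load through your server. Here’s a crude stdlib-only sketch of just that part (a real version should do this with BeautifulSoup rather than a regex; `base_url` is whatever page you fetched):

```python
# Crude sketch: rewrite relative href/src attribute values to absolute
# URLs so stylesheets, scripts, and images still resolve after proxying.
# A regex is fragile on real-world HTML; use an HTML parser such as
# BeautifulSoup in practice. This only illustrates the idea.
import re
from urllib.parse import urljoin

def rewrite_relative_urls(html: str, base_url: str) -> str:
    def repl(match):
        attr, quote, url = match.group(1), match.group(2), match.group(3)
        return f"{attr}={quote}{urljoin(base_url, url)}{quote}"
    # Matches href="..." / src='...' pairs and resolves each URL
    # against base_url (absolute URLs pass through unchanged).
    return re.sub(r'\b(href|src)=(["\'])([^"\']*)\2', repl, html)

# e.g. rewrite_relative_urls('<a href="/about">About</a>',
#                            "https://example.com/")
```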

hmm, interesting project! Have you considered using a headless browser like Puppeteer? It can handle JavaScript and CSS. But watch out for the legal side; some sites might not be cool with this. Maybe try asking the site owners if they have an API? Could be a win-win! What’s your end goal with this project, anyway?