This is a robust RSS scraper service built with Go. It efficiently aggregates and manages RSS feeds from multiple sources, letting users access and consume content from their favorite websites in one place.
- Multi-Feed Support: Scrape and aggregate feeds from multiple sources simultaneously.
- RSS Feed Parsing: Efficiently parse RSS XML into structured feed and post data.
- Database Integration: Store scraped data in a PostgreSQL database for persistent access and management.
- Concurrency: Utilize Go's goroutines to scrape many feeds in parallel for faster processing.
- Error Handling: Robust error handling for network and parsing errors.
- Logging: Comprehensive logging for monitoring and debugging.
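To illustrate the concurrency and parsing features above, here is a minimal sketch of how feeds can be parsed in parallel with goroutines. The `rssFeed` struct and `parseFeed` helper are illustrative, not the service's actual types; real feeds would be fetched over HTTP rather than embedded as strings.

```go
package main

import (
	"encoding/xml"
	"fmt"
	"sync"
)

// rssFeed mirrors the minimal structure of an RSS 2.0 document.
type rssFeed struct {
	Channel struct {
		Title string `xml:"title"`
		Items []struct {
			Title string `xml:"title"`
			Link  string `xml:"link"`
		} `xml:"item"`
	} `xml:"channel"`
}

// parseFeed unmarshals raw RSS XML into an rssFeed.
func parseFeed(data []byte) (rssFeed, error) {
	var f rssFeed
	err := xml.Unmarshal(data, &f)
	return f, err
}

func main() {
	// Inline documents keep the sketch self-contained; the real
	// service would download these bytes from each feed's URL.
	docs := [][]byte{
		[]byte(`<rss><channel><title>Feed A</title><item><title>Post 1</title><link>https://a.example/1</link></item></channel></rss>`),
		[]byte(`<rss><channel><title>Feed B</title><item><title>Post 2</title><link>https://b.example/2</link></item></channel></rss>`),
	}

	var wg sync.WaitGroup
	for _, doc := range docs {
		wg.Add(1)
		go func(d []byte) { // one goroutine per feed
			defer wg.Done()
			feed, err := parseFeed(d)
			if err != nil {
				fmt.Println("parse error:", err)
				return
			}
			fmt.Printf("%s: %d item(s)\n", feed.Channel.Title, len(feed.Channel.Items))
		}(doc)
	}
	wg.Wait()
}
```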
To get started with this project, you can clone the repository and build the project:
git clone https://github.com/haroonalbar/rss-aggregater
cd rss-aggregater
go build
Set up your .env file from .env.example and add your database URL:
mv .env.example .env
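The exact variable names are defined in .env.example; a typical layout, assuming a `PORT` and a `DB_URL` variable, might look like this (credentials and database name are placeholders):

```
PORT=8080
DB_URL=postgres://user:password@localhost:5432/rssagg?sslmode=disable
```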
Migrate the database using goose:
cd sql/schema
goose postgres <connection-url> up
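The actual migrations live in sql/schema; purely as an illustration of the goose annotation format, a migration file (hypothetical name 001_feeds.sql) looks like this, with the Up section applied by `goose ... up` and the Down section by `goose ... down`:

```sql
-- +goose Up
CREATE TABLE feeds (
    id UUID PRIMARY KEY,
    created_at TIMESTAMP NOT NULL,
    name TEXT NOT NULL,
    url TEXT UNIQUE NOT NULL
);

-- +goose Down
DROP TABLE feeds;
```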
Once the project is built, you can run the service:
./rss-aggregater
The service will start and begin scraping RSS feeds as configured.
- Golang: The core programming language used for building the service.
- PostgreSQL: Database for storing feed and post data.
- Goose: Database migration tool for managing schema changes.
- sqlc: Generates type-safe Go code from SQL queries.
- Chi Router: Lightweight, idiomatic and composable router for building Go HTTP services.
- UUID: For generating unique identifiers for feeds and posts.
- CORS: Middleware for handling Cross-Origin Resource Sharing.