Error 451 with https://ghapi.huchen.dev/ #130
Same here. I believe Vercel provides free hosting for open source projects; have you applied for that?
Unfortunately Vercel did not want to sponsor this project as it is API-only. I could no longer find a good free host for the project. I suggest you clone the repo and deploy the code somewhere 😞
That's sad news :/ @github should definitely be sponsoring this, or at least provide an API of their own.
@huchenme How about we add a simple frontend that makes use of the API? That wouldn't be breaching Vercel's TOS, right?
I can try to add a frontend for this, actually
The (unofficial) trending API is offline as per huchenme/github-trending-api#130; will deploy my own instance of the trending API
@huchenme I can give you the code of GitNews https://git.news if you wish.
@sandoche I do have some code; let me add it to the repo during nights and apply for Vercel's sponsorship again. If everything goes smoothly it should be back in a week 🤞
I use this api for my personal website. Since it's having an issue, I cloned and hosted it. |
@akane10 that is great, may I know where you hosted it?
@huchenme yeah sure, it's on DigitalOcean.
@akane10 I tried a Netlify Function and can't recall what didn't work as expected. If you plan to keep your server running, do you mind if I point the API's domain to it?
@huchenme Yes, sure. I don't mind.
The issue I was facing with Netlify is caching. If anyone has any luck with Netlify, please let me know.
It is back now! The only downside is that it no longer has browser caching.
It quickly exhausted the free quota on Netlify, so I had to stop the project.
Hmm, have you considered instead making a library that we can import and await, which would make the request and parse the response for us? That way you don't have to worry about hosting it, and we are still able to use the code you wrote.
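A minimal sketch of that "library instead of a hosted API" idea: the consumer fetches https://github.com/trending itself and parses the HTML locally, so nobody runs a server. The function name and the parsing regex below are illustrative assumptions, not the project's actual parser (which uses a real HTML parser and extracts far more fields).

```javascript
// Hypothetical client-side parser: extract "owner/name" repo slugs
// from the trending page's anchor tags. Selector is a rough sketch.
const REPO_RE = /href="\/([\w.-]+\/[\w.-]+)"/g;

function parseTrending(html) {
  // Collect unique repo slugs in the order they appear.
  const repos = new Set();
  for (const m of html.matchAll(REPO_RE)) repos.add(m[1]);
  return [...repos];
}

// In a real library, the caller would pair this with a fetch:
//   const html = await (await fetch('https://github.com/trending')).text();
//   const repos = parseTrending(html);
```

The trade-off is that every client scrapes GitHub directly (no shared cache), but there is no infrastructure to keep alive.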
Hey, for anyone reading this: I have been using this library for quite a long time and definitely appreciate your work, @huchenme! I have also created a self-hosted version of the scraper server (AWS Lambda compatible): https://github.com/pupubird/get-github-trending. There is a demo site here: https://hackertab.pupubird.com/repositories. All credits go to @huchenme.
Also, by utilizing AWS CloudFront, I am able to cache the responses with an hourly TTL. I will write up a tutorial about it too!
@waningflow will this ever go down? Is there a limit?
@Smeet97Kathiria Deployed on Function Compute on Aliyun. Roughly the first 500k requests per month are free.
@waningflow what happens after it hits the limit?
@Smeet97Kathiria I suppose it's not likely to reach the limit under normal usage. If it does, I may restrict the APIs temporarily. It's also recommended to deploy the service yourself; here is what I use on Function Compute of Aliyun: github-trending-api-node.
Hey @akane10, I tried out your URL, and it seems https://gtrend.yapie.me/spoken_languages doesn't work?
@hedythedev yeah, route
Your problem is caused by re-computing the same exact data thousands of times for no good reason. In the cloud, CPU is expensive and disk is practically free. Use functions to write static data to files, then route users to static files like it's 1995.

Can you work around this issue by creating an AWS-hosted endpoint (trivially built with AWS Amplify -> Hosting)? Then you just need to add a Lambda function which calculates all required responses and stores them as static files in an S3 bucket. From this bucket you can serve the static files as a web page by simply enabling public access to the static hosting (Amplify hosting does that in the background for you).

tl;dr: Create a trivial "web site" to set up free web hosting, plus a Lambda function that updates this same "web site's" files in the S3 bucket (fetch, parse, write JSON...).
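A rough sketch of the "Lambda writes static files" idea, assuming AWS SDK v3 for JavaScript. The bucket name, key layout, and cache lifetime here are made-up placeholders; the pure helper just builds the `PutObject` parameters that the scheduled handler would send.

```javascript
// Scheduled Lambda sketch: compute each API response once, upload it
// to S3 as a plain static JSON file, and let S3/Amplify serve it.
// The SDK client is shown commented out so the helper stays pure:
//   const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');

function putParams(bucket, route, payload) {
  return {
    Bucket: bucket,                      // e.g. 'my-trending-bucket' (placeholder)
    Key: `${route}.json`,                // e.g. 'repositories.json'
    Body: JSON.stringify(payload),
    ContentType: 'application/json',
    CacheControl: 'public, max-age=3600', // let clients/CDN reuse it for an hour
  };
}

// Inside the handler, roughly:
//   const client = new S3Client({});
//   await client.send(new PutObjectCommand(putParams('my-trending-bucket', 'repositories', repos)));
```

The function then runs on a schedule (hourly or daily) instead of per request, so the compute cost is fixed regardless of traffic.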
@andraz-at Instead of making the function store the response on S3, wouldn't it be simpler to cache all the function responses on a CDN, e.g. AWS CloudFront or Cloudflare? Serving from a CDN cache is even cheaper than serving directly from S3. Since these responses only change once a day, you can set the TTL to 86400 seconds (1 day); that way, it's very unlikely you'd run out of free function calls on most FaaS providers. If my memory serves me correctly, @huchenme was actually caching the responses when this project was hosted on Vercel. It's a bummer that they suspended the project and his account as well :/
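The CDN-caching variant boils down to the function emitting a long `Cache-Control` header so CloudFront/Cloudflare serves cached copies and the origin is hit at most about once a day per path. A minimal sketch, using the common Lambda-proxy response shape; `fetchTrending` and the handler wiring are assumptions, only the header logic is the point:

```javascript
// Wrap a payload in a proxy-style response whose Cache-Control header
// tells the CDN it may reuse the cached copy for a full day.
const ONE_DAY_SECONDS = 86400;

function cachedResponse(body) {
  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/json',
      // Responses only change once a day, so a 1-day TTL is safe.
      'Cache-Control': `public, max-age=${ONE_DAY_SECONDS}`,
    },
    body: JSON.stringify(body),
  };
}

// Hypothetical handler:
//   exports.handler = async () => cachedResponse(await fetchTrending());
```

With this in place the free-tier function quota is spent on cache misses only, not on every client request.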
@andraz-at hey, I have already implemented caching on the CloudFront side (the CDN); do feel free to deploy directly to your own AWS account using
@huchenme My fork: https://github.com/wonderbeyond/github-trending-api/ |
Yes, there are alternative optimizations available. My point was that we can chain all those optimizations one after another so that all requests fit into AWS's free tier limit (preferably the permanent one, not the 12-month one). An alternative would be to use an AWS Lambda triggered periodically (a cron event) only for processing the data, then push the hosting onto GitHub Pages / GitHub Gists by programmatically updating those files remotely. This would also be a nice way of making GitHub host their own feed by serving static files. 🙂 We can also do both, to spread the load. But premature optimization is a slippery slope toward not doing anything at all, so we'd better skip digging deeper if it's not needed.
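The gist variant above can be sketched as a cron-triggered function that refreshes a gist's files through the GitHub REST API (`PATCH /gists/{gist_id}`), so GitHub itself serves the JSON. The gist ID, token, and file name below are placeholders; the helper only builds the request, which keeps it easy to inspect:

```javascript
// Build the PATCH request that replaces a gist file's content with
// fresh trending data. gistId and token are placeholders.
function gistUpdateRequest(gistId, token, payload) {
  return {
    url: `https://api.github.com/gists/${gistId}`,
    options: {
      method: 'PATCH',
      headers: {
        Authorization: `token ${token}`,
        Accept: 'application/vnd.github+json',
      },
      body: JSON.stringify({
        files: { 'repositories.json': { content: JSON.stringify(payload) } },
      }),
    },
  };
}

// In the scheduled handler, roughly:
//   const req = gistUpdateRequest(GIST_ID, GITHUB_TOKEN, repos);
//   await fetch(req.url, req.options);
```

Clients would then read the raw gist URL, which GitHub serves and caches for free.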
@waningflow are both of your URLs down?
Hi, @pupubird can I route my app https://apps.apple.com/us/app/superrepo/id1517331914 to your site? |
Hey @Smeet97Kathiria sure! |
Cuz I am using it for the unofficial hackertab huchenme/hacker-tab-extension#55 too, so you are free to use it as well
@pupubird Thanks |
welcome @Smeet97Kathiria |
This looks pretty good! Good work! |
Guys feel free to use mine! http://192.168.1.33:8000/ Thanks |