ukjadoon

New Members • 4 posts

  1. Thank you for the constructive reply. I am already working on an API and planning to use it with an Ionic app that could work on both iOS and Android, as I already have experience with AngularJS. A picture-of-the-day email subscription list is a really sweet idea! Yes, the UI could be better. I did have a social bar before, but it slowed the page down, so I need to find something that wouldn't affect loading times as much, since you are usually at the mercy of external servers for how fast the script loads. Thanks a lot, and sorry for the late reply!
  2. Sorry for the late reply! Yes, I already knew about these technologies, as I read Hacker News a lot and always try to follow what's new and trending. Using an external service for image manipulation (Cloudinary in this case) puts almost no CPU load on my tiny server, which keeps resources free when dynamically resizing images for mobile and desktop devices (there is a rough sketch of this after these posts). That is especially useful during traffic spikes: if the server were also responsible for image manipulation, the website would slow to a crawl. Laravel Forge is a great way to set up pre-configured servers that are secure, snappy, and configured for push to deploy. This helped a lot with my Git workflow, as I can deploy just by pushing to my master branch without having to log into the server and run deploy scripts manually. Forge-configured servers are also great for running queues that take care of long-running tasks. Import.io was great for crawling Reddit and Imgur; if I had to write my own crawling scripts, it would take a lot longer and be harder to maintain and debug. My philosophy is to use external API services for most of the heavy lifting: there is no need to reinvent the wheel, so use what others have already built and perfected and build on top of that.
  3. Those are some good suggestions; however, I am already doing some of the things you mentioned. There are cron jobs set up for crawling, and those cron jobs call the Import.io crawlers I set up for Reddit and Imgur. The training of the crawlers was already done in my custom Import.io scripts, so I don't need to go through the hassle of DOM/XPath crawling; the crawlers grab the image descriptions, paths, and everything else I need. The idea of using the Cloudinary CDN was that I can manipulate images on the fly (resizing, custom overlays, etc.) without using my own server's CPU. Simply hotlinking an image would be bad for Imgur's bandwidth, and it would also saddle my server with the extra resizing and manipulation work. The image URLs are unique indexes in the database, and duplicates are ignored (there is a rough sketch of this after these posts). The crawlers run every three hours, and the content is updated on the site as well as on Twitter and Facebook using Zapier. Of course, the idea of the site seems like stealing, as you said, which makes me kind of sad. I made it basically to collect all the cool posts that come through Reddit every day, get lost over time, and become impossible to find again; I just wanted a nice archive of the best posts from each day. Putting ads on the site was a way to pay for the server and SaaS costs involved in running the service. I know a lot of people might appreciate an archive of this kind over time, but is there a more "white hat" way of doing it? I put source links to the original Imgur posts on each page, so if people want to see the original post they can click the source link and visit it, where credit is due. It's not as though the original posters can never be found from my site, and it's not as though I am taking all the credit for their pictures. If anyone has a problem with any of their images being posted here, they can always ping the Twitter or Facebook account and I'll take that image out, no questions asked. I could take out the ads or just shut down the service; I am not making any money from it, and I don't expect to in the near future, as it is hard to generate unique content from a site like this to attract organic search traffic. I don't know what to do; I guess I'll take some time to decide. It was fun to build, though, and I enjoyed my time with it.
  4. Here is the source code on GitHub. The website is located at TopCuteAnimals. The idea was to collect the top posts from r/aww on Reddit and show them in an extremely simple, responsive interface. Even though the site may seem very simple, being a backend architect, I think I over-engineered it. Here is the technology breakdown:
     • Laravel 5.0
     • A DigitalOcean micro server (512 MB RAM)
     • Laravel Forge setup
     • Blackfire integration
     • New Relic integration
     • Papertrail integration for streaming error logs
     • Beanstalkd for queued jobs
     • Zapier integration
     • Pure.css framework
     • Cloudinary CDN and image manipulation
     • MySQL
     • Redis server
     • Import.io to create a crawler API
     • Bitbucket private repo

     I started out with the idea that the "Show me another one!" button should always show a random image. However, I quickly ran into problems: people would see repeated posts, because the MySQL RAND() function can return the same post ids. Building a more complicated algorithm that cached every post a visitor had already seen (and stored it in a cookie as well) added unnecessary complexity. In the end, I just show the most recently posted image, and the button takes you to the previous post every time you click it (or tap it on a touch device); there is a rough sketch of this after these posts.

     I used Import.io to write a crawler API to fetch the top posts, then wrote cron jobs to call the API for the latest posts every three hours. The actual job runs through a beanstalkd queue as a Laravel artisan command (also sketched after these posts). I didn't want to hotlink the images directly from Imgur and leech their bandwidth, so I use Cloudinary to display the pictures instead. One advantage is that with the Cloudinary API I can resize images and/or convert GIFs to MP4 on the fly (MP4 files take far less bandwidth to deliver). However, since I am not making any money from the site and wanted to stay on the free plan, I decided to mostly ignore GIF files, as they take a lot of bandwidth and disk space on Cloudinary. The Laravel queued command also takes care of uploading the images to Cloudinary.

     For the responsive design I used Yahoo's Pure.css framework. It was fairly easy to work with, and the front end is basically just two Laravel Blade view files that display the site's content (a master template and a view template). I also added a Twitter account, which can be found at TopCuteAnimals Twitter, as well as a Facebook page at TopCuteAnimals Facebook Page. I use a neat service called Zapier to post pictures from my site to the Facebook and Twitter pages; it ran hourly at first, but I didn't want to exhaust Zapier's free plan, so it is now limited to one update a day.

     All the scheduled cron jobs are handled via Laravel Forge, which has honestly been a joy to work with. I have been using Git flow via SourceTree and host my code in a private Git repository on Bitbucket. Whenever I make a release or a hotfix, each push to the master branch triggers an automated deployment to the live site thanks to Laravel Forge's integration. I also use Redis to cache the posts so as not to hammer my tiny server's MySQL database too much (again, sketched after these posts).

     Overall, most of the site was done within a day, with incremental updates that I have been adding in my free time over the past couple of weeks. What do you guys think, and do you have any questions?
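To make the Cloudinary part more concrete: delivery and resizing can be done entirely through Cloudinary's URL-based fetch API, so the transformation work never touches my own server. The sketch below is only illustrative; the cloud name, the helper function, and the example Imgur URL are placeholders rather than the site's actual code (the real site uploads images from the queued command instead of fetching them at request time).

```php
<?php

// Rough sketch: build a Cloudinary "fetch" delivery URL that resizes a remote
// image on the fly. "my-cloud-name" and cloudinaryUrl() are placeholders, not
// the site's real configuration.
function cloudinaryUrl(string $remoteImageUrl, int $width): string
{
    $cloudName = 'my-cloud-name'; // hypothetical Cloudinary account name

    // w_{width}  -> resize to the requested width
    // c_limit    -> never upscale beyond the original size
    // f_auto     -> let Cloudinary pick the best format (e.g. WebP)
    // q_auto     -> automatic quality selection to save bandwidth
    $transformations = "w_{$width},c_limit,f_auto,q_auto";

    return sprintf(
        'https://res.cloudinary.com/%s/image/fetch/%s/%s',
        $cloudName,
        $transformations,
        urlencode($remoteImageUrl)
    );
}

// Example: a small thumbnail for mobile and a larger one for desktop.
echo cloudinaryUrl('https://i.imgur.com/example.jpg', 320) . PHP_EOL;
echo cloudinaryUrl('https://i.imgur.com/example.jpg', 960) . PHP_EOL;
```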
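For the duplicate handling (image URLs as unique indexes, duplicates ignored), a minimal sketch looks roughly like this. The table, column, and model names are assumptions for illustration, not the actual schema.

```php
<?php

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

// Migration sketch: a unique index on the source URL means the same Imgur or
// Reddit post can never be stored twice. Table and column names are hypothetical.
Schema::create('posts', function (Blueprint $table) {
    $table->increments('id');
    $table->string('source_url')->unique(); // unique index on the image URL
    $table->string('title');
    $table->string('cloudinary_url')->nullable();
    $table->timestamps();
});

// Crawler side sketch: firstOrCreate() looks the URL up first and only inserts
// when it is missing, so re-crawled duplicates are silently ignored.
// $crawledItems stands in for the results returned by the Import.io crawler.
foreach ($crawledItems as $item) {
    Post::firstOrCreate(
        ['source_url' => $item['url']],           // lookup key (the unique column)
        ['title'      => $item['title'] ?? '']    // extra attributes when created
    );
}
```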
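The three-hourly crawl is conceptually just a scheduled entry that pushes the real work onto the beanstalkd queue. The sketch below uses current Laravel job/scheduler syntax rather than the exact Laravel 5.0 command-bus classes the site actually uses, and the class names and Import.io endpoint are placeholders.

```php
<?php

use Illuminate\Bus\Queueable;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\Http;

// app/Jobs/FetchTopPosts.php (sketch): the queued job that a beanstalkd worker
// picks up, so the long-running HTTP work never blocks the scheduler.
class FetchTopPosts implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function handle(): void
    {
        // Hypothetical Import.io extractor endpoint; the real URL and API key differ.
        $items = Http::get('https://api.import.io/extractor/EXAMPLE/results')->json();

        foreach ($items as $item) {
            // firstOrCreate() skips posts whose source URL is already stored.
            Post::firstOrCreate(
                ['source_url' => $item['url']],
                ['title'      => $item['title'] ?? '']
            );
        }
    }
}

// app/Console/Kernel.php (sketch): queue the job every three hours; Forge only
// needs the single "schedule:run" cron entry it sets up by default.
protected function schedule(Schedule $schedule)
{
    $schedule->job(new FetchTopPosts)->cron('0 */3 * * *');
}
```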
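And for the "Show me another one!" button plus the Redis caching: a minimal controller sketch, again in current Laravel syntax and reusing the hypothetical Post model from the sketches above; the real route, cache lifetime, and column names may differ.

```php
<?php

use Illuminate\Support\Facades\Cache;

// Controller method sketch: the newest post is shown by default, and the
// "Show me another one!" button simply links to the next-older post, which
// avoids the repeated results that ORDER BY RAND() produced.
public function show(?int $id = null)
{
    // Check Redis first so the tiny server's MySQL database is rarely hit.
    // (600 is seconds in current Laravel; older versions counted minutes.)
    $post = Cache::remember('post:' . ($id ?? 'latest'), 600, function () use ($id) {
        return $id === null
            ? Post::orderByDesc('id')->first()   // most recently stored post
            : Post::findOrFail($id);             // a specific post from the URL
    });

    abort_if($post === null, 404);

    // The button points at whatever post came just before the one on screen.
    $previous = Post::where('id', '<', $post->id)->orderByDesc('id')->first();

    return view('posts.show', [
        'post'       => $post,
        'previousId' => $previous ? $previous->id : null,
    ]);
}
```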