I made a website with Laravel 5.0 and a bunch of other technologies over a weekend (source code on GitHub)


ukjadoon

Here is the source code on GitHub


 


The website is located at TopCuteAnimals. The idea was to collect the top posts from r/aww on Reddit and show them in an extremely simple to use, responsive interface. Even though the site may seem very simple, being a backend architect, I think I over-engineered it. Here is the technology breakdown:


  • Laravel 5.0
  • A Digital Ocean micro server (512 MB RAM)
  • Laravel Forge setup
  • Blackfire integration
  • New Relic integration
  • Papertrailapp integration for streaming error logs
  • Beanstalkd for queued jobs
  • Zapier integration
  • Pure CSS framework
  • Cloudinary CDN and image manipulation
  • MySQL
  • Redis server
  • Import.io to create a crawler API
  • Bitbucket private repo

I started out with the idea that the "Show me another one!" button should always show a random image. However, I quickly ran into problems: people would see repeated posts, because the MySQL RAND() function can return repeating post IDs. Building a more complicated algorithm that cached every post a visitor had already seen, and saved them in a cookie as well, added unnecessary complexity. In the end, I just show the most recently posted image, and the button takes you to the previous post every time you click it (or tap it, on a touch device).
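If you're curious, the "previous post" logic boils down to something like this (a sketch with a hypothetical Post Eloquent model; the real model and column names may differ):

```php
<?php
// A sketch of the "previous post" navigation, assuming a hypothetical
// Post Eloquent model with an auto-incrementing id.

use App\Post;

// Landing page: show the most recently stored post.
$post = Post::orderBy('id', 'desc')->first();

// "Show me another one!": step back to the post just before the current
// one, wrapping around to the newest post when we run out of older ones.
$previous = Post::where('id', '<', $post->id)
    ->orderBy('id', 'desc')
    ->first() ?: Post::orderBy('id', 'desc')->first();
```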


I used Import.io to build a crawler API that fetches the top posts. I then set up cron jobs to call the API for the latest posts every three hours. The actual job runs through a beanstalkd queue as a Laravel artisan command.
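The scheduling side is roughly this (a sketch; the posts:fetch command name is a placeholder, and the queue driver is pointed at beanstalkd in the queue config):

```php
<?php
// app/Console/Kernel.php -- a sketch; the posts:fetch command name is a
// placeholder. QUEUE_DRIVER=beanstalkd makes the queued work run on beanstalkd.

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected $commands = [
        'App\Console\Commands\FetchTopPosts', // hypothetical artisan command
    ];

    protected function schedule(Schedule $schedule)
    {
        // Hit the Import.io crawler every three hours; the command pushes the
        // heavy lifting onto the beanstalkd queue instead of running inline.
        $schedule->command('posts:fetch')->cron('0 */3 * * *');
    }
}
```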


I didn't want to hotlink the images directly from Imgur, as I didn't want to leech their bandwidth, so I used Cloudinary to serve the pictures instead. One advantage is that with the Cloudinary API I can resize images and/or convert GIFs to MP4 on the fly (MP4 files take far less bandwidth to deliver). However, since I am not making any money from the site and wanted to stay on the free plan, I decided to mostly ignore GIF files, as they take a lot of bandwidth and disk space on Cloudinary. The queued Laravel command also takes care of uploading the images to Cloudinary.
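The upload step inside that queued command looks something like this (a sketch using the classic Cloudinary PHP SDK; the public_id scheme and transformation options are assumptions):

```php
<?php
// A sketch of the queued Cloudinary upload; the public_id scheme and
// transformation options here are assumptions, not the site's actual code.

\Cloudinary::config([
    'cloud_name' => 'your-cloud', // placeholder credentials
    'api_key'    => 'key',
    'api_secret' => 'secret',
]);

// Cloudinary fetches the remote Imgur URL itself, so nothing is hotlinked.
\Cloudinary\Uploader::upload($post->source_url, [
    'public_id' => 'aww/' . $post->slug,
]);

// Build a resized delivery URL on the fly; no CPU is spent on my own server.
$thumb = cloudinary_url('aww/' . $post->slug, [
    'width' => 640,
    'crop'  => 'limit',
]);
```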


For the responsive design, I used Yahoo's Pure.css framework. It was fairly easy to work with; basically, it's just two Laravel Blade view files that display the site's content (a master template and a view template).
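Stripped down, the two Blade files fit together like this (a sketch with hypothetical view names and variables):

```php
{{-- resources/views/layouts/master.blade.php -- hypothetical master template --}}
<!DOCTYPE html>
<html>
<head>
    <title>TopCuteAnimals</title>
    <link rel="stylesheet" href="http://yui.yahooapis.com/pure/0.6.0/pure-min.css">
</head>
<body>
    @yield('content')
</body>
</html>

{{-- resources/views/post.blade.php -- hypothetical view template --}}
@extends('layouts.master')

@section('content')
    <div class="pure-g">
        <div class="pure-u-1">
            <img src="{{ $post->image_url }}" alt="{{ $post->title }}">
            <a class="pure-button pure-button-primary"
               href="{{ url('post/' . $post->previous_id) }}">Show me another one!</a>
        </div>
    </div>
@stop
```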


I also added a Twitter account, which can be found at TopCuteAnimals Twitter, as well as a Facebook page, which can be found at TopCuteAnimals Facebook Page.


I used a neat service called Zapier to post pictures from my site over to the Facebook and Twitter pages on an hourly basis. However, I didn't want to exhaust Zapier's free plan, so it is now limited to just one update a day.


All the scheduled cron jobs are handled via Laravel Forge, which has honestly been a joy to work with. I have been using Git flow via SourceTree, and I host my code in a private Git repository on Bitbucket. Whenever I make a release or a hotfix, each push to the master branch triggers an automated deployment to the live site, thanks to Laravel Forge's helpful integration. I also use Redis to cache the posts, so as not to hammer my tiny server's MySQL database too much. Overall, most of the site was done within a day, with incremental updates that I have been adding in my free time over the past couple of weeks.
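The Redis caching amounts to very little code (a sketch; the cache key and 60-minute lifetime are assumptions, with CACHE_DRIVER set to redis):

```php
<?php
// A sketch of caching posts in Redis so most page views never touch MySQL.
// The cache key and the 60-minute lifetime are assumptions.

use Illuminate\Support\Facades\Cache;

$posts = Cache::remember('posts.recent', 60, function () {
    // Only hit MySQL when the cached entry has expired.
    return \App\Post::orderBy('id', 'desc')->take(50)->get();
});
```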


 


What do you guys think and do you have any questions?



 

I think I over-engineered it.
I'd say so, lots of work to show a random image on a page with 2 buttons.

 

Copying and hosting them is like image theft.

Loading a huge image isn't fun for many; I would hotlink from Imgur.

 

If I were to do such a project:

  • A simple image scraper using curl and PHP, most likely with DOM and DOMXPath (see the sketch below)
  • Set a cron job to run the script
  • Scrape hyperlinks (or, alternately, image tags); match the Imgur pattern if you just want those, extract just the codes, and normalize the links the same way for all Imgur URLs
  • Store the full href links so you can expand to other sites one day
  • Set a unique index on the URL to prevent duplicates

If the site you are scraping has adequate data, just use its titles and/or descriptions; otherwise go to the image source pages and obtain that data. In this case Imgur has all the Open Graph meta data.

I would hotlink from Imgur, cache smaller thumbnails, and build a gallery with pagination or some lazy loader in jQuery or JS/Ajax.

If a cached image exists, load that; otherwise scale the image down, or merely use CSS with a max width or height for a thumbnail.

The frontend would be a mobile CSS design with more on the page than just a gallery.
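Something like this rough sketch (the table name, columns, and Imgur URL pattern are just placeholders):

```php
<?php
// A rough sketch of the curl + DOMXPath scraper described above; the table
// name, column names, and Imgur URL pattern are placeholders.

function fetch_page($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_USERAGENT, 'simple-scraper/1.0');
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

$doc = new DOMDocument();
libxml_use_internal_errors(true); // real-world HTML is rarely valid
$doc->loadHTML(fetch_page('https://www.reddit.com/r/aww/top/'));
$xpath = new DOMXPath($doc);

$pdo = new PDO('mysql:host=localhost;dbname=scraper', 'user', 'pass');
// Relies on a unique index: ALTER TABLE images ADD UNIQUE (url);
$insert = $pdo->prepare('INSERT IGNORE INTO images (url, title) VALUES (?, ?)');

foreach ($xpath->query('//a[@href]') as $link) {
    $href = $link->getAttribute('href');
    // Keep only Imgur links; extract the code and normalize every link
    // to one canonical form so duplicates collide on the unique index.
    if (preg_match('~imgur\.com/(\w+)~', $href, $m)) {
        $insert->execute(['https://i.imgur.com/' . $m[1], trim($link->textContent)]);
    }
}
```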

Edited by QuickOldCar

Those are some good suggestions. However, I am already doing some of the things you mentioned. There are cron jobs set up for crawling, and those jobs call the Import.io crawlers I set up for Reddit and Imgur. The training of the crawlers was already done in my custom Import.io scripts, so I don't need to go through the hassle of DOM/XPath crawling; the crawlers grab the image descriptions, paths, and everything else I need. The idea of using the Cloudinary CDN was that I can manipulate images on the fly (resizing, custom overlays, etc.) without using my own server's CPU. Simply hotlinking an image would be bad for Imgur's bandwidth, and it would also kill my server with the additional resizing and manipulation responsibilities. The image URLs have unique indexes in the database, and duplicates are ignored. The crawlers run every three hours, and the content is updated on the site as well as on Twitter and Facebook via Zapier.
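The unique index is just a small Laravel migration along these lines (a sketch with placeholder table and column names):

```php
<?php
// A sketch of the migration behind the unique URL index; the table and
// column names here are placeholders.

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;

class AddUniqueUrlToPosts extends Migration
{
    public function up()
    {
        Schema::table('posts', function (Blueprint $table) {
            // Duplicate crawler results violate this index and are ignored.
            $table->unique('url');
        });
    }

    public function down()
    {
        Schema::table('posts', function (Blueprint $table) {
            $table->dropUnique(['url']);
        });
    }
}
```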

 

Of course, the idea of the site seems like stealing, as you said, which makes me kind of sad. I made it basically to collect all the cool posts that come to Reddit every day, get lost in the course of time, and become impossible to get back to; I just wanted a nice archive of the best posts from each day. Putting ads on the site was a way to pay for the server and SaaS costs involved in running the service. I know a lot of people might appreciate an archive of this kind over time, but is there a more "white hat" way of doing it? I put source links to the original Imgur posts on each page, so if people want to see the original post, they can just click the source link and visit it where credit is due. It's not like the original posters can never be reached or found out from my site, and it's not like I am taking all the credit for their pictures. If anyone has a problem with any of their images being posted here, they can always ping the Twitter or Facebook account and I'll take that image out, no questions asked.

 

I could take out the ads, or just shut down the service. I am not making any money out of it, and I don't expect to in the near future either, as it is hard to generate unique content from a site like this to attract organic traffic from search engines. I don't know what to do; I guess I'll take some time to decide. It was fun to build, though, and I enjoyed my time with it.


  • 1 month later...

You took a clean, simple idea and delivered a solid working system in a short time frame. The end result is well thought out.

 

This project demonstrates that you are a very flexible developer with a lot of domain knowledge in a number of hot areas, from PaaS to cloud API usage, to Git, to DevOps, to the Laravel framework and its ecosystem, and so on. It's a nice portfolio piece, and something you can continue to tweak and work on in your spare time.

 

We could quibble with individual decisions you made, but that's true of every system I've ever been involved with.  

 

Logical extensions?

  • Your sharing/social bar could be more effective.
  • The UI could be a little prettier.
  • Add a REST API, and create some mobile apps.
  • Allow people to comment.
  • Add a mailing list subscription feature and email out a "picture of the day" with an ad.

 

 

Really, there are a lot of things you can add to this that will let you keep experimenting. If more people took the approach you have, there would be a lot more competent developers in the world.


The only thing I'd add is that, in general, all text and images hosted on a website are copyrighted by the site owner/company of that website, assuming they are original pieces of work. Even if a user uploads content to a website, such as a photograph they took, it now legally belongs to the website they uploaded it to, to do with as it wishes. This is an implied legal copyright; you don't even have to state it anywhere on the website, although most sites do have it in some legal section, terms of use, etc. I'm not a lawyer, but this is my understanding.

 

This is from PHPFreaks' own site:

You agree that by creating any content using the Service that you will grant the Company a perpetual, irrevocable, world-wide, transferable, non-exclusive, royalty free license to use said content.

 

 

If you want to display another copyright holder's works on your site, you need to get their written permission, or you can face legal action for copyright infringement if they find out and pursue you. Some websites also have APIs that you can use to display their content without violating any law, assuming you stay within their terms of service.

Edited by CroNiX

  • 1 month later...

Very nice use of SaaS. A lot of these services I hadn't even heard about. Did you just Google these, or did you already know them? And why exactly did you decide to use them?

Sorry for the late reply!

 

Yes, I already knew about these technologies, as I read Hacker News a lot and always try to follow what's new and trending! Using an external service for image manipulation (Cloudinary in this case) means almost no CPU usage on my tiny server. This way I can keep resources free when dynamically resizing images for mobile/desktop devices. That is especially useful during traffic spikes: if the server were also responsible for image manipulation, the website would slow to a crawl.

Laravel Forge is a great way to set up pre-configured servers that are secure and snappy and support push-to-deploy. This helped a lot with my Git workflow, as I could deploy just by pushing to my master branch, without having to log into the server and manually run deploy scripts. Forge-configured servers are also great for running queues that take care of long-running tasks.

Import.io was great for crawling Reddit and Imgur. If I had to write my own crawling scripts, it would take me a lot longer, and they would be harder to maintain and debug. Thus my philosophy is to use external API services for most of the heavy lifting: there is no need to reinvent the wheel. Use what others have built and perfected already, and build on top of that.



Thank you for the constructive reply. I am already working on an API and planning to use it with an Ionic app that could work on both iOS and Android, as I already have experience with AngularJS. A picture-of-the-day email subscription list is a really sweet idea! Yes, the UI could be better, and I did have a social bar before, but it slowed the page down, so I need to find something that doesn't affect loading times as much, since you are usually at the mercy of external servers for how fast the script loads. Thanks a lot, and sorry for the late reply!
