Tutorial: How I created the Multiple Remote Posting on Twoplustwo Message Board App

This app provides the following features:

  • Allows you to click a button to initiate a web scraper that will scrape the latest topics from three www.twoplustwo.com forum categories: Politics, Sports, and Other Other Topics.
  • Allows you to make a post in multiple www.twoplustwo.com topics through a single form.

This is a Ruby on Rails app that uses web scraping (gem 'mechanize'), JSON parsing (gem 'httparty'), rake tasks scheduled via cron (gem 'whenever'), and delayed jobs via ActiveJob (gem 'delayed_job_active_record').

Here is the GitHub repository.

This tutorial assumes that you have a working knowledge of Ruby on Rails.

Let's get started by creating a new Cloud9 IDE Ruby workspace, creating the models for our three categories, and running rake db:migrate.
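The post doesn't show the exact generator commands, so here is a sketch of what they could look like, assuming one model per category with title and url columns (the model names are my guesses; use whatever fits your schema):

rails generate model Politic title:string url:string
rails generate model Sport title:string url:string
rails generate model OtherTopic title:string url:string
rake db:migrate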

Now go into the Rails console and create a user with your twoplustwo account username (:tpt_un) and password (:tpt_pw).
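In the console that might look something like this, assuming a User model with tpt_un and tpt_pw columns (the values below are placeholders):

# rails console
User.create!(tpt_un: 'your_twoplustwo_username', tpt_pw: 'your_twoplustwo_password')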

Now let's add gem 'mechanize' and gem 'httparty' to our Gemfile, run bundle install, and build the web scraper:
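The scraper itself isn't reproduced here, so the class below is only a sketch based on the description that follows: get_title and get_url are the method names referenced later in the post, while the CSS selector is a placeholder you will need to adapt to the forum's actual markup.

# app/models/scraper.rb -- a sketch, not the original
require 'mechanize'

class Scraper
  def initialize
    @agent = Mechanize.new
  end

  # Fetch a forum index page
  def page(url)
    @agent.get(url)
  end

  # Placeholder selector -- inspect the forum HTML and adjust
  def topic_links(page)
    page.search('a.topic-title')
  end

  def get_title(page, index)
    link = topic_links(page)[index]
    link && link.text.strip
  end

  def get_url(page, index)
    link = topic_links(page)[index]
    link && link['href']
  end
end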

Let’s create a rake task.

So what is happening here is that we have created a method, topics, that will be passed five parameters. We first instantiate a new scraper object that we will use, by way of the mechanize gem (which itself uses the nokogiri gem internally), to travel to these web pages and retrieve the data. We then instantiate new objects (our new politics/sports/other topics records) in a loop. download and download_two are class methods located inside our category models.

In case you were wondering, the first_topic and last_topic parameters are different for each model because each category has a different number of sticky topics that we wish to avoid.

These methods use the data returned by our scraper's get_title and get_url methods to update our database, setting each object's url and title attributes.
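Putting those pieces together, a sketch of the rake task and of a download class method might look like this. The forum URLs, the offsets, the exact parameter list, and the delete-and-recreate strategy are all assumptions on my part:

# lib/tasks/twoplustwo.rake -- a sketch
namespace :twoplustwo do
  desc 'Scrape the latest topics for each category'
  task topics: :environment do
    scraper = Scraper.new

    # Placeholder URLs; the offsets skip each category's sticky topics
    Politic.download(scraper, 'https://forumserver.twoplustwo.com/politics-forum-url/', 4, 30)
    Sport.download(scraper, 'https://forumserver.twoplustwo.com/sports-forum-url/', 3, 30)
    OtherTopic.download(scraper, 'https://forumserver.twoplustwo.com/oot-forum-url/', 5, 30)
  end
end

# app/models/politic.rb -- the download class method, sketched
class Politic < ApplicationRecord
  def self.download(scraper, url, first_topic, last_topic)
    page = scraper.page(url)
    delete_all                                   # start fresh on every scrape (assumption)

    (first_topic..last_topic).each do |i|
      create(
        title: scraper.get_title(page, i),
        url:   scraper.get_url(page, i)
      )
    end
  end
end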

We can run this rake task from the terminal with rake twoplustwo:topics.

After creating some routes, we will bring ActiveJob into the mix. We can also go ahead and add gem 'delayed_job_active_record' to our Gemfile and run bundle install. Go to the GitHub page for delayed_job_active_record and follow the installation instructions.
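The routes aren't shown in the post; a minimal sketch, assuming the /topics path used by the link below and the controllers we are about to build, might be:

# config/routes.rb -- minimal sketch
Rails.application.routes.draw do
  root 'politics#index'
  get '/topics', to: 'politics#ohsnap'
  resources :messages, only: [:index, :new, :create]
end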

We are now going to create a PoliticsController and our ActiveJob:

https://gist.github.com/SeanJManning/f0ceea603fd3cc360573eee98d5c1b40

Now we can create a link in the politics/index view:

<%= link_to 'Run Scraper on Topics', '/topics', remote: true %>

This triggers the ohsnap action in our PoliticsController, which immediately passes the work off to the RefreshTopicsJob.
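The gist above contains the author's version; a minimal sketch of the action, simply enqueueing the job and responding to the Ajax request, could be:

# app/controllers/politics_controller.rb -- sketch
class PoliticsController < ApplicationController
  def index
    @politics = Politic.all
  end

  def ohsnap
    RefreshTopicsJob.perform_later   # hand the scrape off to the background queue
    respond_to do |format|
      format.js                      # renders ohsnap.js.erb
    end
  end
end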

We can then alert the user to what is going on using Ajax in our ohsnap.js.erb file:

$('.message').html('We are recreating your topics now!').addClass('green');

RefreshTopicsJob makes a system call to run rake twoplustwo:topics and does so in the background using delayed_job_active_record.
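A minimal sketch of that job, assuming the default queue:

# app/jobs/refresh_topics_job.rb -- sketch
class RefreshTopicsJob < ApplicationJob
  queue_as :default

  def perform
    # Shell out so the scrape runs outside the request/response cycle
    system('rake twoplustwo:topics')
  end
end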

We now need to go to the terminal and run rake jobs:work, and all of the jobs we have queued will begin running. This is the amazing power of the Ruby on Rails community at your fingertips!

So now, just by clicking that link in our view, we can scrape topics from the twoplustwo message board.

Let's display those topics, and let's create a Message model, a MessagesController, and a form so we can start posting remotely in multiple topics.

Here you will notice topic_list, which is located inside the MessagesHelper module.

In this method, we instantiate a new hash and fill it with the url and title data from our models so that we have a collection of all topics to post into; the keys and values come from each topic's title and url attributes. We then make selections based on these key-value pairs, and our selected choices are handed to our MessagesController when we submit the form and serialized into an Array by our model.
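The helper isn't reproduced in the post; a sketch of it, assuming the hash maps each topic's title to its url (and the category model names guessed earlier), might be:

# app/helpers/messages_helper.rb -- sketch
module MessagesHelper
  # Build a { title => url } hash covering every scraped topic
  def topic_list
    topics = {}
    [Politic, Sport, OtherTopic].each do |category|
      category.all.each { |topic| topics[topic.title] = topic.url }
    end
    topics
  end
end

In the form, that hash can feed a multiple select along the lines of <%= f.select :url, options_for_select(topic_list), {}, multiple: true %>, alongside a hidden user_id field.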

serialize :url, Array saves our url attribute as an array, so that we can select multiple topics from the select in our form. However, since we wish to limit those urls to three topics to post into, we use a before_validation callback on create:

self.url = url.take(3)

This cuts down our url array to the first three urls selected before it is saved.
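Put together, the relevant part of the Message model might look like this (the attribute names are assumptions):

# app/models/message.rb -- sketch
class Message < ApplicationRecord
  serialize :url, Array                              # store the selected topic urls as an array
  before_validation :limit_topics, on: :create

  private

  # Keep only the first three selected topics
  def limit_topics
    self.url = url.take(3) if url
  end
end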

So now we are ready to submit a message string and to select the topics to post it to. Once again, we are passing this off to ActiveJob and delayed_job so that it can be handled in the background; otherwise, we would have to wait for each topic to be posted, which is a long pause.

Now that we have handed this job off to PostCreationJob, it will call the post_it instance method from the Message class and pass it the required parameters.
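A sketch of that job; the post doesn't specify the exact arguments, so here I pass the record's id and look it up in perform, which is one common pattern:

# app/jobs/post_creation_job.rb -- sketch
class PostCreationJob < ApplicationJob
  queue_as :default

  def perform(message_id)
    Message.find(message_id).post_it   # post_it does the actual forum posting
  end
end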

Once again, we instantiate a new scraper object to go to work for us and pass it the required parameters. It will log in to www.twoplustwo.com, using the user_id (which we passed as a hidden value in our form) to pull the twoplustwo username and password we created at the beginning of the tutorial.

Then the scraper will start making the posts for us, as assigned.
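The original post_it drives the app's scraper class; to keep this sketch self-contained I use Mechanize directly, and the login URL, form field names, and reply-form matcher below are placeholders you would need to verify against the real pages:

# app/models/message.rb (continuing the sketch above)
def post_it
  user  = User.find(user_id)
  agent = Mechanize.new

  # Log in with the credentials stored on the user record
  login_page = agent.get('https://forumserver.twoplustwo.com/login-url')   # placeholder URL
  login_form = login_page.forms.first
  login_form['vb_login_username'] = user.tpt_un                            # placeholder field names
  login_form['vb_login_password'] = user.tpt_pw
  agent.submit(login_form)

  # Post the same message into each selected topic
  url.each do |topic_url|
    reply_page = agent.get(topic_url)
    reply_form = reply_page.form_with(action: /reply/)                     # placeholder matcher
    next unless reply_form
    reply_form['message'] = message
    agent.submit(reply_form)
  end
end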

Back in the MessagesController, we have created a flash message (this will require additional code in your layouts and views) that passes links to the three topics the user selected, using a ternary operator.
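A sketch of what the create action might look like; the flash handling here is simplified (the layout is assumed to render links from flash[:topics]), and the exact ternary from the original is not reproduced:

# app/controllers/messages_controller.rb -- sketch
class MessagesController < ApplicationController
  def new
    @message = Message.new
  end

  def create
    @message = Message.new(message_params)
    if @message.save
      PostCreationJob.perform_later(@message.id)
      flash[:topics] = @message.url    # the layout/views can turn these into links
      redirect_to new_message_path, notice: 'Your message is being posted!'
    else
      render :new
    end
  end

  private

  def message_params
    params.require(:message).permit(:message, :user_id, url: [])
  end
end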

Awesome!

This is just the starting point! You can do all kinds of awesome things from here!

How about hooking up ActionCable so that we can see topics being created in real time and deliver messages to our user with the success callback of delayed_job?

Or we could use the whenever gem to create a cron job and pull topics on a predefined schedule.

Or maybe let’s start pulling the actual data from the posts on that message board and do some analytics or machine learning!

But what I would really like is for you to hire me.
