Redux From Scratch (Chapter 7 | Building an API Service With Express.js & MongoDB)

Michael Mangialardi · Coding Artist · 27 min read · Jul 10, 2017

Buy the Official Ebook

If you would like to support the author and receive a PDF, EPUB, and/or MOBI copy of the book, please purchase the official ebook.

Prerequisites

Chapter 1 | Core Concepts
Chapter 2 | Practicing the Basics
Chapter 3 | Implementing With React
Chapter 4 | Async Practice With Twitch API
Chapter 5 | Implementing Middleware
Chapter 6 | React, Redux, Firebase Stack

Reviewing the Big Picture

In the previous chapter, we went through the process of setting up Firebase so we could have a database without writing any server-side code. We were able to use the Redux model to fetch data from Firebase, update our state, and render different things to the user interface. While we didn’t go over it explicitly, being able to write to a Firebase database was only a step away given all the information we have accumulated.
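For the curious, a write really is a minimal extension of the reading code we already had. A rough sketch, reusing the Books reference from Chapter 6 (the title/author fields here are my own illustration):

import * as firebase from "firebase";

// a minimal sketch of a write to the Realtime Database
const database = firebase.database();
database.ref('Books').push({
  title: "Some New Book",   // illustrative field
  author: "Some Author"     // illustrative field
});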

Currently, we are doing no server-side stuff on our own. We have created React/Redux applications that are being hosted by webpack-dev-server. We haven’t been interested in what it is doing under-the-hood. We simply installed it as a development dependency and have it running on an npm start command:

"scripts": {
"start": "webpack-dev-server"
},

To our knowledge, it is just taking our bundled React/Redux application and running it on a local host address:

In our minds, we are probably envisioning the architecture (big-picture) of our application like this:

We have our React/Redux application. In this application, Redux can handle fetching data via APIs and updating the state held within a store. The React container components can actively seek out the data in the state and can pass down data as props to React presentational components to render a user interface. The completed React/Redux application gets processed by Webpack and the final (bundled) application gets hosted on Webpack Dev Server.

We can go to a local host address and see our web application:

We could make awesome, dynamic web applications with this architecture and not worry about server-side code. If I decided to launch my own startup, I would probably go this route.

However, larger scale projects fit better with a more robust architecture where the server-side code is managed. We will discuss this more later on.

In short, by writing our own server-side code, we will get more power. But…

Understanding Full-Stack Architecture

We have the following questions to answer in this section:

  • What server-side code will we be writing?
  • What tools do we need in order to do that?
  • Why exactly is this beneficial?

Currently, interacting with a Firebase database has occurred on the client-side via the Firebase API:

For example, we could read data with the Firebase API like so:

import * as firebase from "firebase";

const database = firebase.database();
const books = database.ref('Books');
const authors = database.ref('Authors');

books.on('value', function(snapshot) {
  //do something with values in snapshot of db at reference
});

In the code above, we just have to import Firebase and then we can do API stuff like firebase.database(), database.ref('...'), and books.on('value', function(snapshot) {...}) without much consideration as to how we were able to make these API requests.

The same was true when interacting with the Twitch API:

There was an existing API from Twitch that would give us data when requested. We used Axios to make these requests:

//API request
axios.get('https://api.twitch.tv/kraken/streams/featured?&client_id=skawlpb80ixx8e9cxafxepbn66xhe1')
  .then(response => {
    //do something with response
  })

In other words, we’ve been communicating with established APIs. There has always been a ready-made bridge between our client-side code and the data living somewhere else.

What if we want to use a database like MongoDB but there’s no established bridge between the client-side and the database?

What do we do then?

Well, we need to write our own server-side code to establish an API service so we can create, read, update, and delete from our database.

Ok…but why all the extra effort to switch to MongoDB when Firebase has an established API?

That’s a great question and too often it’s not addressed. Let’s take the time to answer it carefully.

At the very least, switching to MongoDB requires us to write our own server-side code, and specifically, our own API service. It will force us to learn Node.js and how to create our own API. This is a valuable exercise for learning full-stack JavaScript development. Even though we don’t need to write our own server-side code to use a Firebase database, there may be cases where server-side code is needed for something else. It does not hurt to treat this as a learning exercise even if it’s not the best option for a use case that concerns you.

Going off of that, it’s much more likely that businesses are working with server-side code and MongoDB (or any other full-fledged database) than Firebase.

The general consensus seems to be that Firebase is a totally legitimate solution. It has a lower learning curve and doesn’t require as many hands on deck to manage server-side code. However, it is better suited for smaller projects and startups than larger projects and businesses.

MongoDB is meant to be a full-fledged database. It has more features, allows for more control, can be deployed on your own infrastructure (physical or virtual), and has many years of experience and implementation behind it.

If we are looking at it from the angle of what is most convenient for the aspiring JavaScript developer, the answer is Firebase. If we are looking at it from the angle of an established business, then the answer is most likely going to be MongoDB because it is more stable and lowers risk.

To summarize, I think learning how to write our own server-side code by switching to MongoDB is a good learning experience and is not a waste of time. I’m not qualified to make definitive claims about what you should use going forward, nor do I think everything is so dang black and white. Developers like to be overly opinionated and regurgitate whatever they see other people saying. My brief analysis is not an end-all-be-all, but it is enough to say… let’s give MongoDB and server-side code a whirl!

Ok…I see why going forward with MongoDB and server-side code is worth my time. Although, what server-side code do we need to write and what tools do we need to use to make this happen?

Node.js allows us to write server-side applications in JavaScript.

Express.js is a Node.js framework that leverages off the foundation of Node.js (just as React leverages off the foundation of JavaScript) in order to make it easier for us to write robust APIs with good performance. We will be using Express.js to write our own API that will allow us to communicate with an external database and do CRUD (create, read, update, delete) operations.
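To give a feel for how little code a bare Express server takes, here’s a minimal sketch (not part of this chapter’s project; the route and message are illustrative):

var express = require('express');
var app = express();

// respond to a GET request on the root path
app.get('/', function(req, res) {
  res.send('Hello from Express!');
});

// make the app available on localhost:3000
app.listen(3000);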

Fun fact: Webpack Dev Server is actually a small Node.js Express server.

MongoDB will be the external database that we will interact with as already mentioned.

The React/Redux implementation will stay exactly the same except that the server-side code will be handling interactions with the database. The data from the database will still be used to update the state managed by Redux.

We can now think of the big picture of our application like this:

With all of this explaining out of the way, let’s get started on creating our API service for MongoDB via Express.js.

Setting Up Express.js

First off, retrieve this template from GitHub. This template is going to be the shell of the ReactReduxTwitch project we created. It will ultimately have the same functionality except it will retrieve data from our MongoDB database and not from the Twitch API.

Rendering Our React/Redux App Via the Express Server

The first thing we want to do is set up Express.js, which, again, is a Node.js framework that lets us write server-side code.

We are going to be using the Express application generator. You can see all the official documentation here.

The Express Generator is a command line tool that will help us initialize a new Express app quite easily. Install this command line tool:

npm install express-generator -g

Then, open up our new project in a code editor and rename the package.json file to oldpackage.json.

The generator creates the following files for us:

As you can see, the only duplicate from our project template is package.json, hence why we renamed the current one. Once the generator runs, we will merge the two into one package.json file and install the dependencies.

Let’s run the following command to fire the generator:

express --git

As you can see from the options, --git adds a .gitignore file so we avoid uploading unnecessary stuff to GitHub.

The combined package.json file should look like this:

{
  "name": "reactreduxexpressmongo",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "node ./bin/www"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "react": "^15.6.1",
    "react-dom": "^15.6.1",
    "react-redux": "^5.0.5",
    "redux": "^3.7.1",
    "redux-logger": "^3.0.6",
    "redux-thunk": "^2.2.0",
    "body-parser": "~1.17.1",
    "cookie-parser": "~1.4.3",
    "debug": "~2.6.3",
    "express": "~4.15.2",
    "jade": "~1.11.0",
    "morgan": "~1.8.1",
    "serve-favicon": "~2.4.2"
  },
  "devDependencies": {
    "babel-core": "^6.25.0",
    "babel-loader": "^7.1.1",
    "babel-preset-env": "^1.5.2",
    "babel-preset-react": "^6.24.1",
    "html-webpack-plugin": "^2.29.0",
    "webpack": "^3.0.0",
    "webpack-dev-server": "^2.5.0"
  }
}

We combine the dependencies, but we also update the npm start script to the following:

"scripts": {
"start": "node ./bin/www"
},

To explain this, take a look at the new app.js and www file (bin/www). app.js contains all the routes and modules for our application. While it looks like a lot, you can think of it as specifying the files that we want to serve to the browser. The www file creates an HTTP server and places our application on a port. By default, the port is 3000, so we can view our application on localhost:3000 when we run npm start.

For now, let’s run npm install to install all of our dependencies for this project.

In app.js, there is a line of code that is going to serve every static file found in the public folder. Remember, Webpack is bundling our code and injecting it into index.html. Therefore, we can have our Express server host our React/Redux application by simply moving the index.html file under the public directory and updating the index.html location in webpack.config.js:

const HtmlWebpackPlugin = require('html-webpack-plugin');
const HtmlWebpackPluginConfig = new HtmlWebpackPlugin({
  template: './public/index.html',
  filename: 'index.html',
  inject: 'body'
})

Back in app.js there is this thing called view engine setup which we can comment out:

// view engine setup
// app.set('views', path.join(__dirname, 'views'));
// app.set('view engine', 'jade');

We can also delete the following lines:

app.use('/', index);
app.use('/users', users);

Additionally, delete the views folder from the root of our project folder.

There are four more things before we should run npm start and test if this is working.

First, under public/stylesheets, rename the style.css file to main.css. Copy and paste the content of the styles/main.css file into here. You can then remove the entire styles folder. After that, update the following lines in index.html:

<title>React, Redux, Express, Mongo</title>
<link rel="stylesheet" href="stylesheets/main.css">

Next, let’s add a message right above where our React app should be rendering in:

<body>
  <h1>Hello Express.js App</h1>
  <div id="app">
  </div>
</body>

After that, delete the entire routes folder and the following imports of them found in app.js:

var index = require('./routes/index');
var users = require('./routes/users');

Finally, delete the public/images and public/javascripts folders.

Run npm start and check the local host: http://localhost:3000/ :

Our message is being rendered. This means that our Express.js server is properly hosting the index.html file on our local host.

However, our React application is not being rendered.

We know this because the reducer has the following initial state:

//define the initial state
const initialState = {
  status: "loading",
  streams: [0,1,2],
  error: ""
}

Notice, status is set to “loading”. According to our React container component, our animated loader should be rendered:

{status === "loading" ? (
<Loader />
) : (
//...

The issue must be that our bundled React app is not being injected into index.html properly.

After several attempts, I found the solution.

Instead of having the HtmlWebpackPlugin automatically inject the bundled code (as defined in webpack.config.js), we are going to have it output the bundle.js file into the public folder like so:

var path = require('path');

module.exports = {
  entry: './index.js',
  output: {
    filename: 'bundle.js',
    path: path.resolve(__dirname, 'public')
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /(node_modules)/,
        use: {
          loader: 'babel-loader',
          options: {
            presets: ['env', 'react']
          }
        }
      }
    ]
  }
}

Then, update index.html to manually inject this file (and remove our message):

<body>
  <div id="app">
  </div>
  <script src="./bundle.js"></script>
</body>

Finally, we need Webpack to re-bundle our code so let’s run:

webpack

Now, we can run npm start and check the local host:

Our loader is now rendering which means our React/Redux application is up and running on our own Express server!

Before we move on, let’s add one more dependency that is really useful called Nodemon.

Nodemon will listen for changes in our code and will restart our Express server if changes are made to it.

Go ahead and install it:

npm install --save-dev nodemon

Then, let’s update the start script in package.json:

"scripts": {
"start": "nodemon ./bin/www"
},

We are all done with the initial setup of our Express server.

Explaining the Functionality of Express Code

Using the Express Generator got us up and running much faster. However, I don’t want to pretend like this code will automatically make sense. Let’s take a look at how Express is working.

The code looks overwhelming, but unlike a React component, you don’t have to be as concerned with every single detail of what is going on to start. I’m just going to highlight the parts of the Express code that are crucial for understanding why our React/Redux app is now being rendered on a local host.

app.js has quite a few imports. The ones worthy of mention are the following:

var express = require('express');
var path = require('path');

We use require('...') in place of import ... from '...' in our server-side code. It’s important to mention that Node.js does not natively support ES6 module syntax, hence differences like this.
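Side by side, the two module syntaxes look like this (using names from our own project):

// ES6 module syntax (what we use in our client-side React/Redux code)
import express from 'express';
export default app;

// CommonJS syntax (what Node.js understands natively)
var express = require('express');
module.exports = app;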

We are importing Express and path. Path is used to build the path to a certain folder or file for us.

A new Express app is then initialized with var app = express(); .

There are quite a few app.use lines, however, this line is critical as it is hosting our entire public folder:

app.use(express.static(path.resolve(__dirname, 'public')));

In this one line, main.css, index.html, and bundle.js are hosted. We can see this when checking the Sources tab in dev tools:

The following code simply sets an error status and displays an error message when we go to a route that is not defined:

// catch 404 and forward to error handler
app.use(function(req, res, next) {
  var err = new Error('Not Found');
  err.status = 404;
  next(err);
});

// error handler
app.use(function(err, req, res, next) {
  // set locals, only providing error in development
  res.locals.message = err.message;
  res.locals.error = req.app.get('env') === 'development' ? err : {};
  // render the error page
  res.status(err.status || 500);
  res.render('error');
});
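One caveat worth flagging: since we commented out the view engine setup earlier, res.render('error') will itself fail if a request ever reaches the error handler. If that bites you, a simple tweak (my own suggestion, not part of the generated code) is to send a plain response instead:

// error handler (sends a plain response since the view engine is disabled)
app.use(function(err, req, res, next) {
  res.status(err.status || 500);
  res.send(err.message);
});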

Our Express app is then made exportable in the following line:

module.exports = app; //equivalent to 'export default app'

This Express app is imported in bin/www:

var app = require('../app');
var http = require('http');

We also import http so we can create an HTTP server:

var server = http.createServer(app);

This server has to go on a certain port (what comes after localhost:):

var port = normalizePort(process.env.PORT || '3000');
app.set('port', port);

In the code above, the port is set to 3000.

The server then listens on the specified port:

server.listen(port);
console.log("Application available at http://localhost:3000/");

This is what makes our application available on a local host address.

I also added a log that will appear in command line to remind us of the local host address:

All of this to say, running the npm start script runs the bin/www code, which makes our application available via the local host on port 3000.

To run Node.js code through the command line, you do node *insert file* (e.g. node ./bin/www). This is why our npm start command works.

If you want more information, you can check out the official Express documentation.

Code at This Point:

Available on GitHub.

With our Express server up and running, it’s time to move on to MongoDB setup.

Getting Started With MongoDB

Installing MongoDB

First off, install MongoDB Community Server.

The installation instructions are available here.

Make sure to follow all instructions up to Manually Configure a Windows Service for MongoDB Community Edition.

You should be able to start and stop the MongoDB service using the following commands:

net start MongoDB
net stop MongoDB

The only part of the official instructions that might be difficult is creating the mongod.cfg file on Windows if you don’t have admin rights. You can simply open Notepad as administrator, create the new file, and save it as specified.

Installing Robo 3T

Robo 3T (formerly Robomongo) is a GUI for managing MongoDB databases instead of through command line.

Download it here.

Once this easy installation is complete, Robo 3T should run with an open prompt titled MongoDB Connections.

We have not created a MongoDB connection so go ahead and click Create.

Let’s call the connection Databases and leave the port at its default:

Hit Save and then Connect.

We can now access all our MongoDB databases:

There’s nothing important yet as we have yet to create a database. Let’s do that next.

Creating a Database

We are going to create a MongoDB database and run some CRUD operations from command line.

If you want to follow along with the official documentation for CRUD operations, you can do so here.

First, run net start MongoDB to start the MongoDB service.

After that, run the following in command line to fire up a MongoDB server:

"C:\Program Files\MongoDB\Server\3.4\bin\mongod.exe"

Note: you may need to run this as an administrator.

Then, go to Robo 3T, right-click on Databases, click Create Database, and name it TestDatabase.

Next, we can right-click on TestDatabase and click Open Shell:

We can now run queries on our database:

We are going to write queries for CRUD operations.

A quick note before we do that. MongoDB is a NoSQL database. Meaning, it’s going to represent data in JSON format using name and value pairs:

"Object" : {
"Name": "Value"
}

This is exactly like Firebase. There’s some slightly different terminology if you’ve looked at the official documentation.

An outermost object can be called a collection. Within a collection are documents made up of field and value pairs:
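The original figure here showed the canonical example from the MongoDB documentation; a representative reconstruction (the field values are illustrative) is:

// the users collection gets a document made of field: value pairs
db.users.insertOne({
  name: "sue",
  age: 26,
  status: "pending"
})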

The example shown above creates a collection called users with a document describing information about a user in field: value pairs.

Create

From that same example, we can see that the syntax for a create operation of a single document within a collection is as follows:

db.collection.insertOne({ *insert document* })

We can also insert multiple documents at once like so:

db.collection.insertMany([ *insert many documents* ])

//Example
db.products.insertMany( [
  { _id: 13, item: "envelopes", qty: 60 },
  { _id: 14, item: "stamps", qty: 110 },
  { _id: 15, item: "packing tape", qty: 38 }
] )

If the specified collection does not already exist, it will be created automatically.

Let’s create a collection called Products and insert many documents describing an item, quantity, and price of a grocery item:

db.products.insertMany( [
  { item: "applesauce", qty: 2, price: 1.08 },
  { item: "bacon", qty: 1, price: 5.52 },
  { item: "eggs", qty: 2, price: 1.68 }
] )

Copy and paste this into the shell then run it by clicking the play symbol.

We should now see a collection called products. Double-click it and we can see that all of our documents have been inserted into this collection. Each document is automatically given an _id:

We can also see the types of our values for a particular key.

Read

At the top, we should see db.getCollection('products').find({}) in the shell. This is how we can read data from a MongoDB database.

We specify the collection and then the document(s) we want to find, like so:

db.getCollection('products').find(
{ item: "applesauce", qty: 2, price: 1.08}
)

Copy this into the shell and run it:

We can see that it has returned the correct document from the collection we specified.

What if we wanted to find every document in a collection where there is a quantity of 2 specified?

We can run the following command in the shell:

db.getCollection('products').find(
{ qty: 2 }
)

This will return the documents for our applesauce and eggs as expected:

Let’s try running that same kind of query using a comparison operator. We can run the following:

db.products.find( { qty: { $eq: 2} } )

Notice that we don’t need getCollection() here; we can reference the collection directly as db.products.

The $eq is one of many comparison operators. You can view the full list here.
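For instance, a hypothetical variation on our query using the $gt (greater than) operator:

// find every document where qty is greater than 2
db.products.find( { qty: { $gt: 2 } } )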

I will move on to updating and deleting, but if you want to test out all the read possibilities, the official documentation is very helpful and straightforward.

Update

We have three options for running update queries: updateOne, updateMany, and replaceOne.

The update queries change the value(s) of field(s) within a document. Replace will replace the entire document.

For each query, we specify a filter and then the update action. $set is used when specifying the new field (key) and value pair.

Let’s try this example:

db.products.updateMany( 
{ qty: {$eq: 2} },
{ $set: {qty: 4} }
)

This should update every product with a quantity of 2 to a quantity of 4.

Run it and then click on products from the left-hand menu. We should see that it worked:

You can test the other two update queries if you’d like.
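For reference, rough sketches of the other two (the filters and values here are illustrative, not queries we run in this chapter):

// update only the first document matching the filter
db.products.updateOne(
  { item: "bacon" },
  { $set: { qty: 2 } }
)

// replace the entire matching document with a new one
db.products.replaceOne(
  { item: "eggs" },
  { item: "eggs", qty: 12, price: 2.50 }
)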

Delete

Let’s learn how to run queries for our final operation, delete.

This works just like a read operation except it will delete the document(s).

Let’s try deleting the product with a quantity of 1 (our bacon):

db.products.deleteOne( { qty: {$eq: 1} } ) 

If you click products after you run it, we can see it worked:

Let’s view these results in text format:

We can now see what our collection looks like in JSON format:

We have tested out creating a MongoDB database and running basic CRUD operations.

Next, we want to be able to run these queries through a server-side API.

Implementing Our API Service

Creating Our Model

Let’s start by installing a tool called Mongoose:

npm install mongoose --save

Note: If you open up a new command line tab and install this dependency (so our server doesn’t crash), you can see Nodemon restarting the server after our installation:

Pretty sweet!

Mongoose provides a means to model and work with MongoDB data from our Node.js code.

In our project, create a new folder called models. In this folder, create a file called StreamModel.js.

In this file, we are going to start by creating a Mongoose schema.

A schema is used to define what a collection in a MongoDB database will look like. Here’s an example:

var blogSchema = new Schema({
  title: String,
  author: String,
  body: String,
  comments: [{ body: String, date: Date }],
  date: { type: Date, default: Date.now },
  hidden: Boolean,
  meta: {
    votes: Number,
    favs: Number
  }
});

You can see that a schema is a JavaScript object. Within this object, we specify the fields for documents within a collection and their value types.

We didn’t cover this before, however, we can nest/embed documents within documents (e.g. meta).

We can also embed arrays in documents (e.g. comments). These things are also specified in the schema.

We can add the following Schema in StreamModel.js:

var mongoose = require('mongoose');
var Schema = mongoose.Schema;

var streamSchema = new Schema({
  image: String,
  name: String,
  viewers: Number
});

We are going to keep our database for our project very simple. We will have just this one collection and three fields which we can use to render multiple stream cards to the user interface.

After we create our schema, we can create a model. A model just takes the name of a collection and the schema we want to apply to it.

We can then work with the model to configure our API service.

Let’s define the following model after our schema:

var Streams = mongoose.model('Streams', streamSchema);

We will have a collection called Streams and we want it to be shaped as specified in our schema.

To wrap up StreamModel.js, let’s make it exportable:

module.exports = Streams;
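Putting the pieces together, the complete StreamModel.js now looks like this:

var mongoose = require('mongoose');
var Schema = mongoose.Schema;

// define the shape of a document in the Streams collection
var streamSchema = new Schema({
  image: String,
  name: String,
  viewers: Number
});

// create a model that applies the schema to the Streams collection
var Streams = mongoose.model('Streams', streamSchema);

module.exports = Streams;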

Creating the Database

Let’s go ahead and create a database for our project in Robo 3T.

Right-click Databases and select Create Database. We will name it the same as our project, ReactReduxExpressMongo.

That’s it for now!

Creating the API in Express

Let’s open app.js and begin writing an API right here:

First, let’s require Mongoose and establish a connection with our database:

//API Service
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/ReactReduxExpressMongo');

/ReactReduxExpressMongo is added given the name of our database.

We can tweak this slightly to log a successful or unsuccessful connection message:

mongoose.connect('mongodb://localhost/ReactReduxExpressMongo')
  .then(() => console.log('connection successful'))
  .catch((err) => console.error(err));

Next, we can require the Streams model:

var Streams = require('./models/StreamModel.js');

We can then add a response to an HTTP POST request on the root path (http://localhost:3000/):

app.post('/', function(req, res){
  res.json({message: "Hello First API Call!"});
});

We want to simply return a JSON object with a key of message and value of Hello First API Call!.

Download a tool called Postman in order to test out our first API.

We insert the POST request URL and hit send:

Perfect! We can see that we got the JSON object response.

Now, let’s try setting this response on a different route.

Back in app.js, let’s create a router and define an /api route:

var router = express.Router();
app.use('/api', router);

Next, update the post to occur on this route:

app.post('/api', function(req, res){
  res.json({message: "Message from /api"});
})

Notice that I also updated the message.

Update the request URL in Postman and let’s test this:

Right on!

We will need to add more routes, so let’s test out a /api/streams route and respond with two separate messages on our two api routes:

//create api route
var router = express.Router();
app.use('/api', router); //api root
app.use('/api/streams', router); //streams collection

app.post('/api', function(req, res) {
  res.json({ message: 'API Root!' });
});

app.post('/api/streams', function(req, res) {
  res.json({ message: 'Streams route created!' });
});

We can test and verify the /api/streams route response is working via Postman:

The next step is to have our routes specify interactions with the MongoDB database depending on the HTTP action. For the /api/streams route, a POST should create a new stream document and a GET should read all stream documents.

Let’s have our /api/streams route perform these interactions with MongoDB.

To create a new stream document in our Streams collection, let’s update the code for a POST action on the /api/streams route:

app.post('/api/streams', function(req, res) {
  var stream = new Streams();
  stream.image = "http://bit.ly/2sCaKaY";
  stream.name = "Test_Stream";
  stream.viewers = 30;
  stream.save(function(err) {
    if (err) {
      return res.send(err); //return so we don't also send the success response
    }
    res.json({ message: 'Stream created. Check Robo 3T!' });
  });
});

In the code above, we create a new instance of the Streams model, specify the three fields that match the shape of a document as defined in our schema, and respond with a success message (or an error message if the save fails).

Let’s run another test via Postman:

Let’s follow the instructions of this message and check Robo 3T.

Right-click on the ReactReduxExpressMongo database and hit Refresh.

We now have a new collection called streams with a document as specified in our API:

The __v field just specifies the document’s version, don’t fret!

Cool beans! We have interacted with MongoDB via our own API.

Let’s try to get this document by adding the following:

//get all streams
app.get('/api/streams', function(req, res) {
  Streams.find(function(err, streams) {
    if (err) {
      return res.send(err); //return so we don't also send the success response
    }
    res.json(streams);
  });
});

Test this in Postman. Make sure to switch to GET request:

Woot woot! We can see data from our database!
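If you want to push a little further on your own, a sketch of a route that reads a single stream by its id might look like this (the /api/streams/:stream_id route is my own illustration; we won’t wire it into the app in this chapter):

// get a single stream by its MongoDB _id
app.get('/api/streams/:stream_id', function(req, res) {
  Streams.findById(req.params.stream_id, function(err, stream) {
    if (err) {
      return res.send(err);
    }
    res.json(stream);
  });
});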

Rendering Our Data With Redux and React

As we previously mentioned, we have been aiming to replace the established Twitch API with our own API:

Because our GET and POST actions are working on /api/streams, we are ready to retrieve the data via an API request and use the data to render something to the user interface.

We will use axios within an action creator called RequestApi, just as we did before.

First, let’s install axios:

npm install --save axios

Then, add a file within the actions folder called RequestApi.js.

import axios from 'axios';
import FetchRequest from './FetchRequest';
import FetchSuccess from './FetchSuccess';
import FetchFailure from './FetchFailure';

//refresh readers on actions from ReactReduxTwitch
//update code below for our API
function RequestApi() {
  return (dispatch) => {
    //API request
    axios.get('http://localhost:3000/api/streams')
      .then(response => {
        console.log(response);
        //dispatch FetchSuccess, order 2
        //dispatch(FetchSuccess(streams))
      })
      .catch(e => {
        //dispatch FetchFailure, order 3
        dispatch(FetchFailure(e))
      });
    //dispatch FetchRequest, order 1
    dispatch(FetchRequest())
  }
}

export default RequestApi

In the code above, we are doing a GET request on the API we just defined. We are using actions to update the status of our application, which is handled in our reducer, with the exception of dispatch(FetchSuccess(streams)). We will uncomment this after we see the response that is being logged. This code is almost identical to how we handled the Twitch API in Chapter 4.

In our reducer (found in ApiApp.js), let’s update the initial state so the properties update dynamically via our dispatched actions:

//define the initial state
const initialState = {
  status: "",
  streams: [],
  error: ""
}

Next, we need to import and dispatch the RequestApi() action from our React container component called Streams.js:

import RequestApi from '../../actions/RequestApi';

//Provider/Container React Component
class Streams extends React.Component {
  componentWillMount () {
    this.props.store.subscribe(this.forceUpdate.bind(this));
    this.props.store.dispatch(RequestApi());
  }
  //....

Let’s see if this is working. Before we check the local host, we will have to re-bundle our application by running:

webpack

Note: You may be able to configure hot re-bundling (so you don’t have to run webpack manually) using webpack-hot-middleware, but that’s outside our scope.

Let’s now check the local host and the console logs:

Awesome! We can see that our FETCH_REQUEST action correctly updated the status to loading. We can also see that our GET request returned some data in an array. Let’s see what this is.

As expected, it’s the data from our MongoDB database!

Let’s update RequestApi.js so that we can extract this object:

axios.get('http://localhost:3000/api/streams')
  .then(response => {
    console.log(response);
    const streams = response.data.map(function(stream) {
      return stream;
    });
    //dispatch FetchSuccess, order 2
    dispatch(FetchSuccess(streams))
  })

In the code above, we map through every stream object from our response and pass the resulting array to the FetchSuccess action creator.

Next, open FetchSuccess.js and let’s make sure this action creator is adding the streams array into the action definition:

//define action within an action creator
function FetchSuccess(streams) {
  const FETCH_SUCCESS = 'FETCH_SUCCESS'
  return {
    type: FETCH_SUCCESS,
    status: "success",
    streams
  }
}
export default FetchSuccess

Finally, we can tweak how our reducer handles this in ApiApp.js:

case 'FETCH_SUCCESS':
  const successful = Object.assign({}, state, {
    status: action.status,
    streams: action.streams
  })
  return successful

Now, our FETCH_SUCCESS action will update the initial streams property with the most recent extraction of the streams from the GET request.

Let’s make sure this is working by running webpack and then checking the local host:

In the console, we can see that the FETCH_SUCCESS action did add our stream object returned from the API request to the state!

The final step to have this all working is to update our React components so we can have a UI card for this stream.

In our container component found in Streams.js, we need to pass down the contents of the streams array in our state down to the presentational component.

Let’s update the rendering of multiple StreamCard components so that it passes down the properties for our streams array as props:

const streamCardItems = stateProps.streams.map((stream) =>
  <StreamCard
    key = { stream._id }
    image = { stream.image }
    name = { stream.name }
    viewers = { stream.viewers }
  />
);

Next, open StreamCard.js and update it to the following:

import React from 'react';

//Presentational React Component
class StreamCard extends React.Component {
  render() {
    return (
      <div className="stream-cards">
        <img
          className="stream-cover"
          src={this.props.image}
        />
        <h4 className="stream-name">Stream: {this.props.name}</h4>
        <h4 className="stream-viewers">Viewers: {this.props.viewers}</h4>
      </div>
    )
  }
}
export default StreamCard

We are using the props to render information about a stream.

Also, update the .stream-cover styling in main.css to the following:

.stream-cover, .stream-name, .stream-viewers {
position: relative;
margin: 20px;
}

Run webpack and check the local host:

Everything is rendering correctly!

Note: Twitch channel stream previews were used as the image values in our database. Since the streams weren’t live by the time I got to this section, the images are the Twitch logo with a camera.

Let’s manually add another stream document to our database and see if it is rendering multiple streams as we expect it to.

Open Robo 3T and run the following in the shell:

db.streams.insertMany( [
{ "viewers" : 40,
"name" : "Test_Stream_2",
"image" : "http://bit.ly/2sCaKaY"
},
{ "viewers" : 50,
"name" : "Test_Stream_3",
"image" : "http://bit.ly/2sCaKaY"
},
{ "viewers" : 60,
"name" : "Test_Stream_4",
"image" : "http://bit.ly/2sCaKaY"
}
] )

Refresh the local host and we can now see all of our data loading:

Woohoo!

Making Our POST Request Dynamic

There’s a lot that we can do with our API, however, I’m just going to stick to the basics and you can explore on your own.

To finish off the API for the scope of this project, we will make our POST request dynamic.

Recall, it currently just adds a stream document with predefined information:

//post a new stream
app.post('/api/streams', function(req, res) {
  var stream = new Streams();
  stream.image = "http://bit.ly/2sCaKaY";
  stream.name = "Test_Stream";
  stream.viewers = 30;
  stream.save(function(err) {
    if (err) {
      return res.send(err); //return so we don't also send the success response
    }
    res.json({ message: 'Stream created. Check Robo 3T!' });
  });
});

We will just be testing this with Postman and won’t make any changes to our React/Redux code for the sake of time.

In order to make this request dynamic, we will use URL parameters in the request that define what information should be used for the new document.

We can handle these parameters by updating our POST API request in app.js like so:

//post a new stream
app.post('/api/streams', function(req, res) {
  var stream = new Streams();
  stream.image = req.param('image');
  stream.name = req.param('name');
  stream.viewers = req.param('viewers');
  stream.save(function(err) {
    if (err) {
      return res.send(err); //return so we don't also send the success response
    }
    res.json({ message: 'Stream created. Check Robo 3T!' });
  });
});

We use req.param('insert key') to pull the information for creating the new document. (Note that req.param() is deprecated in Express 4; req.query and req.body are the preferred ways to read parameters in newer code.)

Now, open Postman, click Params, and match the following:

Here’s the link if you just want to copy: http://bit.ly/2sCaKaY

Send the request and check the local host:

We can see that this worked perfectly!

The request made an additional document in MongoDB:

Note: You have to refresh the database if you want to check for yourself in Robo 3T.

In your React/Redux code, an example use case would be to create a form and use an event handler to make a POST request with parameters using the information from the submitted form.
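To make that concrete, here is a minimal, hypothetical sketch; the AddStreamForm component and its wiring are my own illustration, not code from this chapter:

import React from 'react';
import axios from 'axios';

// hypothetical form that POSTs a new stream to our API
class AddStreamForm extends React.Component {
  constructor(props) {
    super(props);
    this.state = { image: 'http://bit.ly/2sCaKaY', name: '', viewers: 0 };
    this.handleSubmit = this.handleSubmit.bind(this);
  }
  handleSubmit(event) {
    event.preventDefault();
    // send the form values as URL parameters, matching our req.param() handling
    axios.post('http://localhost:3000/api/streams', null, { params: this.state })
      .then(response => console.log(response.data.message));
  }
  render() {
    return (
      <form onSubmit={this.handleSubmit}>
        <input
          placeholder="Stream name"
          value={this.state.name}
          onChange={e => this.setState({ name: e.target.value })}
        />
        <button type="submit">Add Stream</button>
      </form>
    );
  }
}
export default AddStreamForm;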

Again, I will leave that up to your own exploration.

Final Code

Available on GitHub.

Concluding Thoughts

My hope was to deliver a chapter that helps make the overwhelming feeling of full-stack development using React, Redux, Express, and MongoDB underwhelming and achievable. While we didn’t do everything under the sun with API interactions, you are just a step away and should possess the knowledge to handle exploring on your own. 😸 😍 💪 😜

Chapter 8

Chapter 8 is now available.

Buy the Official Ebook

If you would like to support the author and receive a PDF, EPUB, and/or MOBI copy of the book, please purchase the official ebook.

Cheers,
Mike Mangialardi
