I’ve been a developer for about five years now, but this is my first time actually writing about my work, so bear with me. Brainbox Digital is a widget platform that delivers personalized ads to users across various websites. It does this through widgets that continuously surface products relevant to the content of the publisher’s web page.

Technologies used

The main technologies we used to develop the application are ReactJS with Redux. For those of you who don’t know, Redux is a predictable state container: it keeps the client-side application state in a single store and only changes it through dispatched actions.
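To make the idea concrete, here is a hand-rolled sketch of the pattern in plain JavaScript — no Redux import, just the core concept of a store that is read via getState and changed only by dispatching plain action objects through a reducer:

```javascript
// A minimal "store" illustrating the Redux idea. The real library adds
// middleware, dev tooling, etc., but the contract is essentially this.
function createStore(reducer, initialState) {
  let state = initialState;
  const listeners = [];
  return {
    getState: () => state,
    dispatch: (action) => {
      // The reducer computes the next state from the previous one.
      state = reducer(state, action);
      listeners.forEach((fn) => fn());
    },
    subscribe: (fn) => listeners.push(fn),
  };
}

// Example reducer: a simple counter.
const counter = (state = 0, action) => {
  switch (action.type) {
    case 'INCREMENT':
      return state + 1;
    default:
      return state;
  }
};

const store = createStore(counter, 0);
store.dispatch({ type: 'INCREMENT' });
store.dispatch({ type: 'INCREMENT' });
// store.getState() is now 2
```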

Express is used to create a small Node.js proxy server that decouples the client side from the backend. This makes development easier: we don’t need to run any backend code locally, because the Node server proxies requests to the backend. This is the “cool” factor in the project. Other dev dependencies we use are Babel, Gulp and webpack.

import fs from 'fs';
import morgan from 'morgan';
import express from 'express';
import proxyRequest from './proxy-requests';
import serverConfig from './config/server';

const { staticDirectories, logLevel } = serverConfig;

export default function startServer() {
  const app = express();
  app.use(morgan(logLevel));

  staticDirectories.forEach((path) => {
    app.use('/static', express.static(path));
  });

  app.engine('html', (filename, options, callback) => {
    fs.readFile(filename, 'utf8', (err, str) => {
      if (err) {
        return callback(err);
      }
      callback(null, str);
    });
  });

  app.set('views', './static');
  app.set('view engine', 'html');

  // API requests
  app.all('/api/v1/*', proxyRequest);

  // Status check
  app.all('/status', proxyRequest);

  // Serve index.html for all routes
  app.all('/*', (req, res) => res.render('index'));

  return app;
}

Above we have a very basic Node.js Express server, and below, in proxy-requests.js, we use the http-proxy module to forward requests to the backend.

proxy-requests.js
import httpProxy from 'http-proxy';
import serverConfig from './config/server';

const proxy = httpProxy.createProxyServer();

// Register the error handler once, outside the request handler,
// so a new listener isn't added on every proxied request.
proxy.on('error', (err) => {
  console.error(`Could not connect to '${serverConfig.apiHostname}' API server. Please try again…`);
  // next();
});

export default (req, res, next) => {
  const { apiHostname, protocol, proxyConf } = serverConfig;
  let [host, port] = apiHostname.split(':');
  if (!port) {
    port = 80;
  }
  proxy.proxyRequest(req, res, {
    target: { host, port, protocol },
    secure: false,
    changeOrigin: true
  });
};
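Both files above read from ./config/server. The article doesn’t show that file, so the following is a hypothetical sketch of its shape, with the fields matching what the two files destructure from it; the actual values are assumptions:

```javascript
// Hypothetical ./config/server — not shown in the original project;
// the field names mirror what server.js and proxy-requests.js read.
export default {
  logLevel: 'dev',                  // morgan log format
  staticDirectories: ['./static'],  // directories served under /static
  apiHostname: 'localhost:8080',    // backend host[:port] to proxy to
  protocol: 'http',                 // protocol for the proxy target
  proxyConf: {},                    // extra http-proxy options, if any
};
```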

Application structure

-components
-containers
-images
-resources
-sass
-services
-store
-widgets
-index.html
-index.js

This is the basic structure of the project.

In the components folder we store mostly the simple components of the application, while containers holds the main components (pages). The images folder holds the application assets, resources the configuration files for the app, and sass the application style sheets. In services we store the backend endpoints and the API calls, in store we have the actions, reducers and sagas for the application, and in widgets we define the widgets that will be injected into the publisher’s site.

index.html is the main entry point of the application, and in index.js we store the routes for the app. React Router is used to handle the routing for the application.

index.js
import 'babel-polyfill';
import React from 'react';
import ReactDOM from 'react-dom';
import { Provider } from 'react-redux';
import { Router, Route, IndexRoute, browserHistory as history } from 'react-router';
import rootSaga from './store/sagas';
import { store } from './store/config/configureStore';
import App from './containers/app/App';
import Home from './containers/home/Home';
import Pages from './containers/pages/Pages';
import Widgets from './containers/widgets/Widgets';
import CreateWidget from './containers/widgets/CreateWidget';
import NotFound from './containers/misc/NotFound';

store.runSaga(rootSaga);

ReactDOM.render(
  <Provider store={store}>
    <Router history={history}>
      <Route path="(/)" component={App}>
        <IndexRoute component={Home}/>
        <Route path="pages" component={Pages}/>
        <Route path="widgets(/)" component={Widgets}>
          <Route path="create" component={CreateWidget}/>
          <Route path="(:id)" component={CreateWidget}/>
        </Route>
      </Route>
      <Route path="*" component={NotFound}/>
    </Router>
  </Provider>,
  document.getElementById('root')
);

The store folder is where all the magic of Redux happens. It’s structured as follows:

-actions (action types and action creators)

In _subroutines.js we have some helper functions for creating the action types for the API calls, so instead of declaring each individual action type (REQUEST, SUCCESS, FAILURE), we call a function that generates these action types based on a base action type. The same thing happens for the actions as well.

_subroutines.js
import { store } from './../../store/config/configureStore';

export function dispatch(type, payload) {
  store.dispatch(action(type, payload));
}

export function action(type, payload = {}) {
  return { type, ...payload };
}

export function createRequestTypes(base) {
  return ['REQUEST', 'SUCCESS', 'FAILURE'].reduce((acc, type) => {
    acc[type] = `${base}_${type}`;
    return acc;
  }, {});
}

export function createRequestActions(base) {
  return {
    request: () => action(`${base}_REQUEST`),
    success: (data, response) => action(`${base}_SUCCESS`, { data, response }),
    failure: (data, error) => action(`${base}_FAILURE`, { data, error }),
  };
}
widget.js
import { action, createRequestTypes, createRequestActions } from './_subroutines';

/** Delete widgets **/
export const DELETE_WIDGETS = 'DELETE_WIDGETS';
export const DELETE_WIDGETS_TYPES = createRequestTypes(DELETE_WIDGETS);
export const DELETE_WIDGETS_ACTIONS = createRequestActions(DELETE_WIDGETS);
export const deleteSelectedWidgets = (params) => action(DELETE_WIDGETS, { params });
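To see exactly what those helpers expand to, here is a standalone copy of action, createRequestTypes and createRequestActions (duplicated from _subroutines.js purely for illustration) exercised with the DELETE_WIDGETS base type:

```javascript
// Standalone copies of the _subroutines.js helpers, for illustration only.
function action(type, payload = {}) {
  return { type, ...payload };
}

function createRequestTypes(base) {
  return ['REQUEST', 'SUCCESS', 'FAILURE'].reduce((acc, type) => {
    acc[type] = `${base}_${type}`;
    return acc;
  }, {});
}

function createRequestActions(base) {
  return {
    request: () => action(`${base}_REQUEST`),
    success: (data, response) => action(`${base}_SUCCESS`, { data, response }),
    failure: (data, error) => action(`${base}_FAILURE`, { data, error }),
  };
}

const DELETE_WIDGETS_TYPES = createRequestTypes('DELETE_WIDGETS');
// { REQUEST: 'DELETE_WIDGETS_REQUEST',
//   SUCCESS: 'DELETE_WIDGETS_SUCCESS',
//   FAILURE: 'DELETE_WIDGETS_FAILURE' }

const DELETE_WIDGETS_ACTIONS = createRequestActions('DELETE_WIDGETS');
// DELETE_WIDGETS_ACTIONS.success(['id-1'], { ok: true }) produces
// { type: 'DELETE_WIDGETS_SUCCESS', data: ['id-1'], response: { ok: true } }
```

So one base string gives you the three lifecycle types plus matching action creators, instead of nine hand-written declarations per API call.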

-reducers (updating the store in response to dispatched actions)

In the reducers folder we listen for the dispatched actions and update the store with the parameters passed along.

widget.js
import { UPDATE_WIDGET } from './../../actions/widgets';

const initialState = {
  colors: {
    background: '#ffffff',
    main: '#c71818',
    text: '#ffffff'
  },
  widgetName: '',
  widgetType: {}
};

export default (state = initialState, action = {}) => {
  const { type, ...rest } = action;
  switch (type) {
    case UPDATE_WIDGET:
      return {
        ...state,
        ...rest
      };
    default:
      return state;
  }
};
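To illustrate the merge the reducer performs, here is the same reducer exercised directly, with the UPDATE_WIDGET constant inlined as a string so the snippet is self-contained:

```javascript
// Self-contained copy of the widget reducer, with the action type inlined.
const UPDATE_WIDGET = 'UPDATE_WIDGET';

const initialState = {
  colors: { background: '#ffffff', main: '#c71818', text: '#ffffff' },
  widgetName: '',
  widgetType: {}
};

const widgetReducer = (state = initialState, action = {}) => {
  const { type, ...rest } = action;
  switch (type) {
    case UPDATE_WIDGET:
      // Everything on the action besides `type` is merged into the state.
      return { ...state, ...rest };
    default:
      return state;
  }
};

const next = widgetReducer(initialState, {
  type: UPDATE_WIDGET,
  widgetName: 'Related products'
});
// next.widgetName is 'Related products'; colors and widgetType are untouched,
// and any unknown action returns the existing state object unchanged.
```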

-sagas

In sagas, we use redux-saga to request data from the backend. With redux-saga the UI components never invoke a task directly; instead they dispatch plain action objects to notify the store that something happened. Sagas watch for dispatched actions and fork a task whenever a specific action they are listening for is dispatched.

index.js
import { call, fork, put } from 'redux-saga/effects';
import { watchLoadPages, watchDeletePages, watchUploadUrls } from './pages';
import { watchLoadWidgets, watchDeleteWidgets, watchSaveWidget } from './widgets';

export function* fetchEntity(entity, apiFn, id, url) {
  yield put(entity.request(id));
  const { response, error } = yield call(apiFn, url || id);
  if (response) {
    yield put(entity.success(id, response));
  } else {
    yield put(entity.failure(id, error));
  }
}

export default function* root() {
  yield [
    fork(watchLoadPages),
    fork(watchDeletePages),
    fork(watchLoadWidgets),
    fork(watchDeleteWidgets),
    fork(watchUploadUrls),
    fork(watchSaveWidget),
  ];
}
upload-urls.js
import { call, put, take } from 'redux-saga/effects';
import * as ActionTypes from './../../actions';
import { UploadUrl } from './../../../services';
import { fetchEntity } from './../index';

const apiFn = fetchEntity.bind(null, ActionTypes.UPLOAD_URLS_ACTIONS, UploadUrl);

export function* watchUploadUrls() {
  while (true) { // eslint-disable-line
    const { params } = yield take(ActionTypes.UPLOAD_URLS);
    yield call(apiFn, params);
    yield put(ActionTypes.action(ActionTypes.LOAD_PAGES));
    yield put(ActionTypes.action(ActionTypes.DESELECT_PAGES));
  }
}
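The worker pattern above doesn’t depend on redux-saga itself: a saga is just a generator that yields descriptions of work instead of doing the work. Here is a simplified, library-free sketch of a fetchEntity-style worker, where the effect objects are plain objects I made up for illustration, stepped manually the way redux-saga’s runtime would:

```javascript
// A simplified fetchEntity-style worker. It yields plain effect objects
// rather than performing I/O, which is what makes sagas easy to test.
function* fetchEntityLike(apiFn, id) {
  yield { type: 'PUT', action: { type: 'REQUEST', id } };
  const { response, error } = yield { type: 'CALL', fn: apiFn, args: [id] };
  if (response) {
    yield { type: 'PUT', action: { type: 'SUCCESS', id, response } };
  } else {
    yield { type: 'PUT', action: { type: 'FAILURE', id, error } };
  }
}

// Step the generator by hand, playing the role of the saga runtime.
const gen = fetchEntityLike('fakeApi', 42);
const first = gen.next().value;   // { type: 'PUT', action: { type: 'REQUEST', id: 42 } }
const second = gen.next().value;  // { type: 'CALL', fn: 'fakeApi', args: [42] }
// Feed a fake API result back in; the worker then dispatches SUCCESS.
const third = gen.next({ response: { items: [] } }).value;
// third.action.type is 'SUCCESS'
```

This is why the tasks are testable without any network: the test just asserts on the yielded objects and feeds results back in.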

As a frontend developer I found this a very interesting project: it combined lots of technologies and pushed me to experiment a bit. Working without “normal” backend code, using a Node.js proxy instead, was especially fun, once I got the hang of it anyway.

This was originally published on Medium.