Building a web application using Rust



Go from scratch to a simple CRUD application using Rust

Rust has evolved into a very useful language across many different domains, and it offers a surprising amount of benefit even to higher-level CRUD applications: it is simple and straightforward to write, while providing excellent compile-time analysis of your code.

I don't assume you have any knowledge of Rust or have it installed on your system. That said, you should absolutely take learning Rust seriously: spend some time really understanding its approach to memory management to get the most out of the language, and to learn a whole new way a language can operate.

We're going to be using an Ubuntu machine and installing PostgreSQL, rustc, and cargo to begin.


Install Rust And PostgreSQL on Ubuntu

Make sure your system is up to date and has the build-essential package installed:


sudo apt update
sudo apt upgrade
sudo apt install build-essential
                    

Using rustup we can easily download and install the latest cargo (Rust's package manager) and the rustc compiler.

curl https://sh.rustup.rs -sSf | sh

Let's install and configure postgresql to run as our database.

sudo apt-get install postgresql-14

Let's log in to our new postgres account and create ourselves an application user and password to use in our soon-to-be Rust application!

Setting up a user, database and table with permissions in PostgreSQL


sudo su - postgres
createuser rust_is_easy --pwprompt
                    

Set the [password] as prompted and remember it! Now let's add a database to our new PostgreSQL server to power our app!

createdb rusty_data_sets

Now let's use the `psql` command to access our local PostgreSQL server and run some commands:

psql

We're connected to the PostgreSQL server with our command-line client! We can try some commands to look around and see what's going on.

\l
will list the databases that are available for you to connect to
\conninfo
will show your connection information

Let's create ourselves a simple SQL table before we get started!


CREATE TABLE IF NOT EXISTS person (  
    key uuid,
    name varchar(256),
    age int,
    emails text[],
    country text,
    created bigint,
    updated bigint
);
                     

Whoa. We have arrays, bigints, regular ints, text, varchar, and a uuid stored in a table? No problem: Rust has strict typing, but it handles all of these use cases nearly transparently.
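To make "nearly transparently" concrete, here's a quick sketch (my own example, not the code we'll write below) of how those column types line up with plain Rust types. With the uuid crate, the key column would be a uuid::Uuid; a String stands in here so the sketch has no external dependencies.

```rust
// Hypothetical sketch: how the person table's columns map onto Rust types.
pub struct PersonRow {
    pub key: String,         // uuid         -> uuid::Uuid (String in this sketch)
    pub name: String,        // varchar(256)
    pub age: i32,            // int          -> 32-bit signed integer
    pub emails: Vec<String>, // text[]       -> a vector of Strings
    pub country: String,     // text
    pub created: i64,        // bigint       -> 64-bit signed integer
    pub updated: i64,        // bigint
}

pub fn example_row() -> PersonRow {
    PersonRow {
        key: "00000000-0000-0000-0000-000000000000".to_string(),
        name: "Ferris".to_string(),
        age: 34,
        emails: vec!["ferris@example.com".to_string(), "crab@example.com".to_string()],
        country: "Sea".to_string(),
        created: 0,
        updated: 0,
    }
}

fn main() {
    // Each SQL column lands in a strongly typed field, no casting required.
    let row = example_row();
    assert_eq!(row.age, 34);
    println!("{} has {} emails", row.name, row.emails.len());
}
```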

Now let's go ahead and grant privileges on our database to our application user:

GRANT ALL PRIVILEGES ON DATABASE rusty_data_sets TO rust_is_easy;

Create your Rust project

Open a new terminal! Let's navigate to a folder we want all of our Rust projects to live under. Once we're in our /home/user/dev folder (or whatever you choose!), we're going to use Cargo to create our new project for us.

cargo init rusty_persons

Navigate into our new rusty_persons directory and have a look around using the long form of the ls command:


cd rusty_persons
ls -l

We can run this now to see the default hello world output and verify our setup is working before we go any further.

cargo run

Creating a dynamic web service in Rust

Great! Let's go ahead and add some dependencies that we'll use to start to listen on an http port and serve some dynamic html templates for our users.

Tide is a library that helps us with the boilerplate of operating a web server. Tokio enables us to do async programming in Rust much more easily. Askama is a great template library, based on Jinja, that helps us serve dynamic HTML templates to clients.

Add the following to your project's Cargo.toml file!

                    
[dependencies]
tide = "*"
tokio = { version = "*", features = ["full"] }
askama = "*"

Now if you run `cargo run`, you'll find your command line doing a lot more than before as it fetches and compiles the new packages you've included.

The first thing we want to do is create a new directory for our Askama templates at the root level of our repository:

mkdir templates

Now let's add a quick layout file so we can define the repetitive HTML once. Pay close attention to the `{% block content %}{% endblock %}` portion; that is where the content of our actual pages will end up being templated in!

touch templates/layout.html
                    
<!doctype html>

<html lang="en">

<head>
  <meta charset="utf-8">
</head>

<body>
  <div id="content">
    {% block content %}{% endblock %}
  </div> 
</body>

</html>

Very simple and straightforward so far. Let's create a file for our home page's content next.

touch templates/home.html

Let's add a quick group of elements with some dynamic text we can update at request time for our users.

Check out the `{% extends "layout.html" %}` and `{% block content %}` lines. The `extends` keyword is where we tell our home template which layout it belongs to, and we decide what should be included in the layout's content block using our `{% block content %}` and `{% endblock %}` tags.


{% extends "layout.html" %}

{% block content %}
<div>
  <h1>Hey look it's me!</h1>
  <h2>{{greeting}}</h2>
</div>
{% endblock %}

What is {{greeting}}? Are we shoving a variable into the template that we haven't defined? Askama uses strongly typed Rust structs that you define to supply any variable we need (and functions too, but that's for another day!).

Let's go update our main.rs file now. Let's read the whole thing, and then we can break it down.


use askama::Template;
use tide::{Request, http::mime};

pub async fn home(_req: Request<State>) -> tide::Result {
    let home = HomeTemplate::new("Hello!".to_string());
    Ok(tide::Response::builder(tide::StatusCode::Ok)
        .content_type(mime::HTML)
        .body(home.render_string())
        .build())
}

#[derive(Template)]
#[template(path = "home.html")]
pub struct HomeTemplate {
    greeting: String,
}

impl HomeTemplate {
    pub fn new(greeting: String) -> Self {
        return Self { greeting };
    }

    pub fn render_string(&self) -> String {
        return self.render().unwrap();
    }
}

#[derive(Clone, Debug)]
pub struct State {}

#[tokio::main]
async fn main() -> tide::Result<()> {
    let state = State {};

    let mut app = tide::with_state(state);
    app.at("/").get(home);
    app.listen("127.0.0.1:8080").await.expect("Listening works");
    Ok(())
}

That's a lot! Don't worry let's work backwards from the main function and see how straightforward and explicit this is.


#[derive(Clone, Debug)]
pub struct State {}
#[tokio::main]
async fn main() -> tide::Result<()> {
    let state = State {};

    let mut app = tide::with_state(state);
    app.at("/").get(home);
    app.listen("127.0.0.1:8080").await.expect("Listening works");
    Ok(())
}

Using #[tokio::main] we tell the compiler to use Tokio's main wrapper, which allows us to declare main as an `async` function.

`let state = State {};` will create an empty struct for the server state; later, we're going to add a database connection pool and all the shared service stuff we need.

`let mut app = tide::with_state(state);` is the magic that uses Tide to create an app server for us; we can use this to attach middleware and define our routes.

`app.at("/").get(home)` is how we map the URL "/" to our handler function "home". We can use `:some_var` as well if we need to pass some data in via the URL, but for now we just want the `/` route to call our home function.

`app.listen("127.0.0.1:8080").await.expect("Listening works")` is how we tell our Tide app instance where to listen. What's await? Expect? Don't sweat it: `.await` is how we indicate to Tokio that we're waiting on the result of an async function (listen, in this case, is async). `expect` is the lazy man's panic: listen returns a Result with the possibility of an error. In production we'd handle this to perform some logging before we fall over, but for now we don't care, so we'll just panic with our custom message using `expect()`.
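If await, Result, expect, and match are new to you, here's a tiny standalone sketch (plain std Rust, nothing Tide-specific) of expect versus handling a Result explicitly:

```rust
// A fallible function that returns a Result, just like Tide's listen() does.
fn parse_port(s: &str) -> Result<u16, std::num::ParseIntError> {
    s.parse::<u16>()
}

fn main() {
    // expect(): take the Ok value, or panic with our custom message on Err.
    let port = parse_port("8080").expect("port parses");
    assert_eq!(port, 8080);

    // match: handle both possibilities explicitly, no panicking.
    match parse_port("not-a-port") {
        Ok(p) => println!("listening on port {p}"),
        Err(e) => println!("bad port, logging before we bail: {e}"),
    }
}
```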


pub async fn home(_req: Request<State>) -> tide::Result {
    let home = HomeTemplate::new("Hello!".to_string());
    Ok(tide::Response::builder(tide::StatusCode::Ok)
        .content_type(mime::HTML)
        .body(home.render_string())
        .build())
}
     

Our handler function in all of its glory! Two lines of code! Of course nothing is that simple: we are creating a HomeTemplate object and passing it the String "Hello!", and then we're calling Ok() and passing in a tide::Response. Let's start backwards.

Our tide::Response::builder lets us create an HTML response with an OK 200 status code, and for the body we're stuffing in the return value of our home template object's `render_string` method.

We can look at our HomeTemplate struct to get the full picture:


#[derive(Template)]
#[template(path = "home.html")]
pub struct HomeTemplate {
    greeting: String,
}

impl HomeTemplate {
    pub fn new(greeting: String) -> Self {
        return Self { greeting };
    }

    pub fn render_string(&self) -> String {
        return self.render().unwrap();
    }
}
     

We see that our template(path = "home.html") attribute points to our template file (the default path looks in a templates/ directory), and we derive Askama's Template trait for the struct.

In the `impl` block we first define a constructor, our `new` function, and then we see our render_string function, which simply calls render and returns the resulting string to be used in the Tide response's body.

Hey! Our HomeTemplate struct has the `greeting` field that we were able to access from within our home.html template file. Great news!

Using a PostgreSQL connection in a handler using Rust

We have a little web app running and serving us HTML that we can control with a little handler function. Next we'll want to connect to a database and start using results from it in our HTML templates to really get anything done.

SQLx to the rescue! Its query!() macro gives you compile-time SQL syntax checking (provided we set the DATABASE_URL environment variable, which we'll do with dotenv in a moment; it's easy), and if you're using VS Code with the rust-analyzer plugin you'll see the errors right in your editor. A very neat little productivity boost.

Let's add SQLx to our Cargo.toml, and while we're there let's add a UUID library and a library to help us serialize and deserialize JSON: Serde.

Oh wait! Let's also add support for .env files, a great way to manage environment variables using a simple local file.


sqlx = { version = "*", features = ["postgres", "runtime-tokio-native-tls", "uuid"] }
serde = { version = "*", features = ["derive"] }
serde_json = { version = "*" }
uuid = { version = "*", features = [
    "v4",       # Lets you generate random UUIDs
    "fast-rng", # Use a faster (but still sufficiently random) RNG
    "serde",
] }
dotenv = "*"

Once you've added this to your Cargo.toml, your cargo build or cargo run command will do some thinking in order to get you those packages.

Let's go ahead and create a .env file with a connection string defined for the database connection code we'll write in a moment.

touch .env

Let's add the following to that .env file. I'm using the rust_is_easy user, a dummy password, and the rusty_data_sets database, but you should configure these to whatever you set up in PostgreSQL at the beginning of all this.

DATABASE_URL=postgres://rust_is_easy:[password]@localhost:5432/rusty_data_sets

Whew, alright, let's add some code to our main to parse our DATABASE_URL and get our handler functions a connection to our database!


dotenv::dotenv().ok();
let db_url = std::env::var("DATABASE_URL")
    .expect("Missing `DATABASE_URL` env variable, needed for running the server");
let db_pool: PgPool = Pool::connect(&db_url).await.unwrap();
let state = State {db_pool};

dotenv::dotenv().ok() will load our .env file into environment variables on our behalf. We can use `std::env::var` to pluck values from the environment, which lets us use our db_url variable to connect a new database pool to Postgres!

We are passing in db_pool to our state now, so let's update that struct


#[derive(Clone, Debug)]
pub struct State {
    db_pool: PgPool,
}

Let's add the missing import statements up top, if your editor didn't already do that for you.

use askama::Template;
use serde::{Deserialize, Serialize};
use sqlx::{PgPool, Pool};
use std::str::FromStr;
use tide::{Request, http::mime};

Not so fast! We need to actually do something with our new database connection pool. Let's add a POST and a GET to send and receive some JSON blobs, with a couple of new routes.


app.at("/person").post(insert_person);
app.at("/person/:person_id").get(get_person);

Posting JSON and storing it in PostgreSQL using Rust

Let's start with our POST route. Importing serde helps us automatically parse our JSON into a type, so let's define that type now.


#[derive(Debug, Deserialize, Serialize)]
pub struct Person {
    pub key: uuid::Uuid,
    pub name: String,
    pub age: i32,
    pub emails: Vec<String>,
    pub country: String,
    pub created: i64,
    pub updated: i64,
}

Structs are pretty straightforward in Rust: pub is short for public, and fields are simply listed as `name: Type`.

And `derive`! We are using Rust, after all. We use derive along with some existing macros to "inherit" some features; in this case, the ability to be serialized and deserialized to and from JSON by serde.

We'll need the handler function, which will use Rust's match functionality to check whether the JSON body was successfully parsed and respond accordingly, using Tide responses as we did before in our home handler.


pub async fn insert_person(mut req: Request<State>) -> tide::Result {
    // body_json() from tide returns a future we await, and awaiting it gives us a "Result" type.
    // We can use a match statement on a Result to handle the Ok and the Err possibilities!
    let umd: Result<Person, tide::Error> = req.body_json().await;
    
    match umd {
        // body_json didn't throw an error! We're safe!
        Ok(person) => {
            
            // We're going to acquire a connection from our pool! `mut` allows us to modify the variable.
            let mut conn = req.state().db_pool.acquire().await.expect("Get connection");
            
            // Let's query! An insert statement using $ placeholders, followed by our newly parsed person's fields as arguments.
            // The query! macro is what provides compile-time SQL checks.
            sqlx::query!("INSERT INTO person (key,  name, age, emails, country, created, updated) values($1, $2, $3, $4, $5, $6, $7)",
                    person.key,
                    person.name,
                    person.age,
                    &person.emails,
                    person.country,
                    person.created,
                    person.updated,
                )    
                .execute(conn.as_mut())
                .await
                .expect("Insert Success");

            // using serde we can to_string an object of our Person type and store it as a string variable!
            // let's write it out using our JSON mime type to the client
            let j = serde_json::to_string(&person).expect("To JSON");
            Ok(tide::Response::builder(tide::StatusCode::Ok)
                    .content_type(mime::JSON)
                    .body(j)
                    .build())
        }
        
        // There's an error parsing our JSON body! Bad Request!
        Err(e) => {
            println!("{:?}", e);
            Ok(tide::Response::builder(tide::StatusCode::BadRequest)
                .content_type(mime::JSON)
                .body("{\"error\": \"invalid json body\"}")
                .build())
        }
    }
}

We are able to use the sqlx::query! macro to get compile-time checks over our SQL, and using serde makes the serialization part easy. Anyone new to Rust is highly encouraged to dive into the match syntax, enums, and what Options and Results are. They are all easy concepts, but integral to Rust.
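For a taste of those concepts in one place, here's a small std-only sketch (my own example, unrelated to our person table) of an enum, an Option, and match:

```rust
// An enum: a type that is exactly one of a fixed set of variants.
enum Lookup {
    Found(String),
    Missing,
}

fn find_email(emails: &[String], domain: &str) -> Lookup {
    // Option models a possibly-absent value; we convert it into our own enum.
    match emails.iter().find(|e| e.ends_with(domain)) {
        Some(email) => Lookup::Found(email.clone()),
        None => Lookup::Missing,
    }
}

fn describe(result: Lookup) -> String {
    // match forces us to handle every variant explicitly.
    match result {
        Lookup::Found(email) => format!("found {email}"),
        Lookup::Missing => "no match".to_string(),
    }
}

fn main() {
    let emails = vec!["a@example.com".to_string(), "b@test.org".to_string()];
    assert_eq!(describe(find_email(&emails, "test.org")), "found b@test.org");
    assert_eq!(describe(find_email(&emails, "gmail.com")), "no match");
}
```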

Getting JSON from PostgreSQL via our Rust web service

Let's add our handler function for the GET route. We're going to use req.param to pluck the URL parameter we defined in the routing via the :person_id in app.at("/person/:person_id").get(get_person);


pub async fn get_person(req: Request<State>) -> tide::Result {
    // here we match on our param; if we find it, we can use it as a string!
    match req.param("person_id") {
        Ok(key) => {
            let mut conn = req.state().db_pool.acquire().await.expect("Get connection"); 
            // Parsing a string into a Uuid type; this ensures we're dealing with a non-null and valid UUID from here on out.
            let person_uuid = uuid::Uuid::from_str(key).expect("uuid parse");
            
            // select, passing in our strongly typed UUID
            let person_map = sqlx::query!("select key, name, age, emails, country, created, updated from person where key=$1", person_uuid)
                .fetch_one(conn.as_mut())
                .await
                .expect("Get record");
            // using the person map we can pluck out our selected columns as fields
            let person_obj = &Person {
                key: person_map.key.expect("key exists"),
                name: person_map.name.expect("name exists"),
                age: person_map.age.expect("age exists"),
                emails: person_map.emails.expect("emails exists"),
                country: person_map.country.expect("country exists"),
                created: person_map.created.expect("created exists"),
                updated: person_map.updated.expect("updated exists"),
            };
            // turn it into a JSON string
            let j = serde_json::to_string(person_obj)
                .expect("To JSON");

            // write it out with mime type JSON!
            Ok(tide::Response::builder(tide::StatusCode::Ok)
                .content_type(mime::JSON)
                .body(j)
                .build())
        }
        // no param found: a bad request
        Err(e) => {
            println!("{:?}", e);
            Ok(tide::Response::builder(tide::StatusCode::BadRequest)
                .content_type(mime::JSON)
                .body("{\"error\": \"missing person_id param\"}")
                .build())
        }
    }
}

That's it! You have a dynamic home page and a couple of JSON API routes. We think it's important to make the technology we use more efficient and deliberate at every part of the stack.

We love working in Rust, but we also enjoy working in Go, Python, C#, you name it! Reach out with your next project or problem and get a quick quote.