Giving TodoMVC the API It Deserves: Part 1 – Todo Creation

TodoMVC is a great way to compare different JavaScript frameworks and get a feel for how to build an app that’s slightly more complex than what can easily fit in a README.
In this series, we’ll start where TodoMVC left off and build a backend for the app that allows syncing your todos across devices. We’ll take a look at the different possibilities inside one particular cloud provider, Azure, but the hope is that eventually others can add similar tutorials for other cloud providers.
Let’s get started!

The Frontend

We’ll start with a completed front-end TodoMVC application. If you already have a favorite framework, feel free to use the TodoMVC apps found on todomvc.com. We’ll use a slightly trimmed-down version of the Vue.js app for our example. You can find the code here. Feel free to clone the repo and run the app locally following the instructions in the README.
If you’re unfamiliar with Vue.js, you can learn more about it from the awesome resources on the official website. We won’t need to modify the app too much to make it work with our API so don’t worry if you’ve not used Vue before.

Creating an API

The very first thing we’ll do is create an extremely simple API for creating todos. To keep things simple, we’ll use the same language used in TodoMVC, JavaScript, for the backend. We have several options for how to build our API, all of which have their pros and cons. We could, for instance, use a framework like Express and run the app on a virtual machine (VM) that we have to manage ourselves. Or we could let Azure do a bit more work for us and run the API on Azure App Service, which handles provisioning and even scales our app out when it sees lots of traffic.
But as we’re just starting out, we want to keep things simple – the less work we have to do the better. For this reason we’ll use Azure’s Functions as a Service (FaaS) offering called Azure Functions. With Functions we can write code for handling an HTTP request and deploy it to the cloud without worrying about how many servers we want to run, if they have the right dependencies installed, etc. We just write the code and the service will make sure it runs properly.
FaaS is also commonly known as serverless, but this name is a bit misleading. There are still servers; we just don’t have to worry about managing them. We write the code and the rest is taken care of for us.
We’ll start by creating a function for creating todos.

Creating a Functions App

The first thing we’ll need to do is create a "Functions App". Various functions are always grouped together under an "app". At first we’ll have just one function (for creating a todo), but eventually we’ll have many more, and the app lets us group them together logically.
There are lots of ways to create an Azure Functions app and to populate it with functions, including the Azure Portal, VS Code, and the command line. I’ve written about this before if you want to learn more.
My preferred way of creating an Azure Function is the Azure Functions command line tool, which we can install on macOS with Homebrew (brew tap azure/functions; brew install azure-functions-core-tools) and on Windows with npm (npm i -g azure-functions-core-tools --unsafe-perm true).
To create a new Azure Functions app run the following and choose node as the runtime and JavaScript as the language:
$ func init

And then create an Azure Function by running the following and choosing HTTP trigger and naming the function CreateTodo.
$ func new

We now have our function living in a CreateTodo directory. We can run it locally by running the following command:
# make sure you have an LTS version of node like node 10.15.3
func start

If you visit http://localhost:7071/api/CreateTodo?name=World in your browser, you should see "Hello World".

Storage

Before we can start writing our function, we’ll need to make a decision about where we want to store our data. Again, we want to choose something simple that allows us to efficiently read and write data without much fuss. While we could use a SQL database or some other more robust option, the best option for now is a simple key/value store that allows us to write data without having to define its structure beforehand.
For this, we’ll be choosing Azure Table Storage, which, like other key/value stores (e.g., Redis), provides a simple way to write data partitioned on some key, allowing for efficient lookup. In the case of a todo app where each user has one list of todos, the most straightforward answer is to use an identifier for the user as the key. Since we don’t yet care about authentication, we’ll just require the user to provide us an email address, which we’ll use as the key to partition the data. That way, when we want to fetch all of a user’s todos, we just need to provide Table Storage with their email address and it will fetch them extremely quickly.
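To make that key/value model concrete, here’s a small sketch in plain JavaScript (toEntity is a hypothetical helper for illustration, not part of the app) of how a todo maps onto the entity shape Table Storage stores:

```javascript
// Hypothetical helper: map a user's email plus a todo onto the
// PartitionKey/RowKey shape that Table Storage stores.
function toEntity(email, todo) {
  return {
    PartitionKey: email, // all of one user's todos land in one partition
    RowKey: todo.id,     // must be unique within that partition
    title: todo.title    // arbitrary extra fields are allowed
  }
}

const entity = toEntity("ryan.levick@example.com", { id: "123", title: "Buy milk" })
console.log(entity.PartitionKey) // "ryan.levick@example.com"
```

Fetching a user’s list then amounts to asking Table Storage for every row in the partition keyed by that email.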
Alright, enough talking, let’s actually get this working.

Adding Table Storage Access to the Function

We first need to configure our function to be able to talk to Table Storage. This requires extending the Azure Functions runtime since it only supports HTTP request handling by default.
We can do this with the Azure Functions CLI tool. To set up the boilerplate around the extensions system, run the following:
$ func extensions install

Among other things, this creates an extensions.csproj file. If you’ve never done dotnet development before, a csproj file is just a project definition file for a dotnet project, written in XML. The Functions runtime is written in dotnet, so it makes sense that in order to extend it we’ll need to edit a csproj file.
In the extensions.csproj file we need to add the following line to the ItemGroup list (the exact Version may differ; use the latest release of the storage extension):

<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.Storage" Version="3.0.0" />

This indicates we want to use the storage extension which allows us to use Table Storage among other types of storage services like blobs and files.
We need to run func extensions install again to install any dependencies for this particular extension.

Configuring Our Function to Use Table Storage

Ok, now that we can use the storage extension in our Functions app, we need to configure our CreateTodo function to actually use Table Storage.
In the CreateTodo directory, there is the function.json file which is a configuration for this particular function. It should contain the following:
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}

The bindings value lets our function know how it’s bound to the outside world. In this case, we’re letting the function know it will be triggered by an HTTP POST request we’ll call req and it will return an HTTP response we’ll call res. Our function can be triggered by other things like timers and events, but this one is configured for HTTP.
We need to add an additional binding for Table Storage. Add the following to the "bindings" array:
{
  "type": "table",
  "tableName": "Todos",
  "direction": "out",
  "name": "todosTable",
  "connection": "STORAGE_CONNECTION"
}

The type field indicates we’re creating a binding for a table; tableName indicates we’re interacting with a table named Todos (which we’ll create soon); direction indicates we’re writing out to the table rather than reading from it; name is the name we’ll use to refer to the table in our function’s code; and connection is the name of the environment variable holding the connection information for our storage account, like the account name and account key.
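With the new binding added, the complete function.json should look roughly like this (the authLevel and other values should match whatever func new generated for you):

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "table",
      "tableName": "Todos",
      "direction": "out",
      "name": "todosTable",
      "connection": "STORAGE_CONNECTION"
    }
  ]
}
```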

Talking to Table Storage

Now that we’ve configured our function to use Table Storage, we need to actually use it. In the CreateTodo directory, there is an index.js file which contains the code for our function. Replace its contents with the following:
module.exports = async function (context, req) {
  // check for an email inside of the Authorization header
  const email = req.headers["authorization"]
  if (!req.body || !email) {
    // there was no body or no Authorization header
    // setting context.res will make our function return the specified response
    context.res = {
      status: 400,
      body: "missing body or authorization header"
    }
    return
  }

  const todo = {
    id: generateId(),
    title: req.body.title
  }

  if (!validTodo(todo)) {
    context.res = {
      status: 400,
      body: "invalid todo"
    }
    return
  }

  // the todo is valid
  // write to the todosTable
  context.bindings.todosTable = [{
    PartitionKey: email,
    RowKey: todo.id,
    title: todo.title
  }]

  context.res = {
    // the response is a 200 by default
    body: todo
  }
}

function validTodo(todo) {
  return todo && todo.title
}

// https://stackoverflow.com/questions/105034/create-guid-uuid-in-javascript
function generateId() {
  return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, c => {
    const r = Math.random() * 16 | 0
    const v = c == 'x' ? r : (r & 0x3 | 0x8)
    return v.toString(16)
  })
}

Functions require that we export an async function that takes two arguments: a context object that represents the context of our function, and a req object that represents the current HTTP request.
Our function first checks that we have an Authorization header (which we access using all lowercase letters, and which we assume for now contains an email address) and a request body. If either is missing, we return an HTTP 400 response.
We then create a todo object with a generated id and a title passed to us in the request body. We then check to make sure that the todo is valid (by checking that title is truthy).
Once we know we have a valid todo, we can set context.bindings.todosTable to an array of objects that will be stored in our table. Remember that we set todosTable as the name of our table binding in the function.json file.
The object we put in the array assigned to todosTable has three keys:

PartitionKey is how our table will be partitioned. Table Storage can do very fast lookups within a partition, so it makes sense to partition by email: when we look up a user’s todos, Table Storage can quickly return everything associated with that email.

RowKey is a unique identifier within a partition. We’ll use the id we’ve generated. These ids are globally unique, which is stronger than necessary: a RowKey only needs to be unique within its partition.

title is just the title of our todo. Table Storage requires us to provide PartitionKey and RowKey and we can add as many additional fields as we want. For now, we’re just storing the title.
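As a quick sanity check on that uniqueness claim, the ids generateId produces follow the version-4 UUID shape: the third group starts with 4 and the fourth with 8, 9, a, or b. A standalone sketch repeating the helper so the format can be verified:

```javascript
// Same generator as in index.js, repeated here so the check is standalone.
function generateId() {
  return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, c => {
    const r = Math.random() * 16 | 0
    const v = c == 'x' ? r : (r & 0x3 | 0x8)
    return v.toString(16)
  })
}

// Version-4 UUID shape: 8-4-4-4-12 hex digits, third group leading with 4,
// fourth group leading with 8, 9, a, or b.
const V4 = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/
console.log(V4.test(generateId())) // true
```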

Creating our Table Storage

In order for us to successfully run this Function, we need to actually create an Azure Storage account so we can use Table Storage. To do this, we’ll use the Azure CLI which you can install with the directions here.
First you’ll need to make sure you’re logged in to Azure which you can do by running az login.
Then we’ll create a resource group where all of our Todo Cloud resources will live. Resource groups are just a way of grouping resources so you can handle them all together. Run the following:
az group create \
  --name TodoCloud \
  --location eastus

Feel free to choose a location that’s closer to you. You can see all locations by running az account list-locations --query "[].name".
Next we’ll create the account:
$ az storage account create \
  --name todocloud \
  --resource-group TodoCloud \
  --location eastus \
  --sku Standard_LRS \
  --encryption blob

You can change --location to a location near you. The rest is fine to leave the same.
We’re getting so close! We now have a storage account, and we can create our Todos table using the following command:
$ az storage table create --account-name todocloud --name Todos

We now have everything in place for our function to work. The last thing we need is to make sure the STORAGE_CONNECTION environment variable we’re referencing in our function.json config file is properly populated. We can do this by setting it in the Values object found in the local.settings.json file of our Functions app. These values automatically get set as environment variables our Functions app can use.
In the Values object of your local.settings.json file, add the following key and value:
"STORAGE_CONNECTION": "DefaultEndpointsProtocol=https;AccountName=todocloud;AccountKey=$SOME_LONG_KEY"

Make sure that you replace $SOME_LONG_KEY with a proper account key. You can find a storage account key with the following command:
$ az storage account keys list \
  --account-name todocloud \
  --resource-group TodoCloud \
  --query "[0].value"
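For the curious, the connection string itself is just semicolon-separated key=value pairs. This small sketch (parseConnectionString is a hypothetical helper for illustration; the Functions runtime does this parsing for us) shows the parts it carries:

```javascript
// Hypothetical helper: split an Azure storage connection string into its
// named parts. Splits on the first '=' only, since base64 account keys
// can themselves end in '=' padding.
function parseConnectionString(conn) {
  return conn.split(';').reduce((acc, pair) => {
    const idx = pair.indexOf('=')
    acc[pair.slice(0, idx)] = pair.slice(idx + 1)
    return acc
  }, {})
}

const parts = parseConnectionString(
  "DefaultEndpointsProtocol=https;AccountName=todocloud;AccountKey=abc123=="
)
console.log(parts.AccountName) // "todocloud"
```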

With this all in place we should now be able to run our Function App locally again with the following command:
$ func start

To create a todo we can just use cURL:
curl -XPOST http://localhost:7071/api/CreateTodo -d '{"title": "Buy milk"}' -i -H "Authorization: ryan.levick@example.com"

If you get a 200 back, you can be sure that your todo has been stored in Table Storage.
You can use the Azure Storage Explorer app to view the data inside your table.

Change our TodoMVC app

In our client’s code in app.js, we’ll want to replace the addTodo method (which is what gets called when the user hits the enter key after typing a todo to create it).
Replace the existing body of addTodo with the following:
const value = this.newTodo && this.newTodo.trim()
if (!value) {
  return
}
fetch(API + "/CreateTodo", {
  headers: {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "Authorization": "ryan.levick@example.com"
  },
  method: "POST",
  body: JSON.stringify({ title: value })
}).then(res => {
  if (res.status < 500) {
    this.todos.push({ id: this.todos.length + 1, title: value, completed: false })
    this.newTodo = ''
  }
})

This code is pretty naive and not the best user experience, but it does the trick. Just as before, we get the todo from this.newTodo and clean it up a bit. But now we make an HTTP POST request to the API we have running locally. If the response’s status is less than 500, we do what we did before: save the todo to our local state and reset the newTodo input field.
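To see exactly what goes over the wire, here’s a sketch that builds the same request options separately (createTodoRequest is a hypothetical helper, shown only so the shape of the POST can be inspected without a running server):

```javascript
// Hypothetical helper: assemble the options object passed to fetch
// when creating a todo.
function createTodoRequest(email, title) {
  return {
    method: "POST",
    headers: {
      "Accept": "application/json",
      "Content-Type": "application/json",
      // the email stands in for real auth until we add authentication
      "Authorization": email
    },
    body: JSON.stringify({ title })
  }
}

const req = createTodoRequest("ryan.levick@example.com", "Buy milk")
console.log(JSON.parse(req.body).title) // "Buy milk"
```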
If we open up the index.html file at the root of our client and add a todo, we’ll get a CORS error. To enable CORS for our app, we’ll have to make one last change. We need to add a Host key to the JSON object in the local.settings.json file with the value {"CORS": "*"}.
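Concretely, local.settings.json ends up looking something like this (the entries under Values are whatever func init generated for you plus our connection string, and $SOME_LONG_KEY is still a placeholder for your real account key):

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "STORAGE_CONNECTION": "DefaultEndpointsProtocol=https;AccountName=todocloud;AccountKey=$SOME_LONG_KEY"
  },
  "Host": {
    "CORS": "*"
  }
}
```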
If we restart our locally running function, and try to create a todo, it should work.
Hooray! 🎉

Conclusion

We’ve done a lot in this post: we created an Azure Function and hooked it up to a newly created table living in a Storage Account on Azure.
In the next post we’ll create a "GetTodos" endpoint for fetching our todos, and we’ll deploy our function to Azure so we don’t need to run it locally.
I’d love to see others’ takes on how to create a backend for TodoMVC, both on Azure and on other cloud providers. Feel free to contribute a PR to the repo, write a blog post about your experience, or reach out to me on Twitter.

Link: https://dev.to/azure/giving-todo-mvc-the-api-it-deserves-part-1-todo-creation-19h4