Next.js Course: Building a database schema with Supabase
In this lesson, we learn how to create a database schema for Supabase and how to use the Supabase client to interact with the database.
In the previous sections, we have set up the foundations of our SaaS application and added a way for users to sign up and log in.
In this section, we will create a database schema for our application and learn how to use the Supabase JavaScript client to interact with the database.
As we've mentioned before, the application we will be building allows users to generate blog posts using OpenAI's GPT-3 API.
Therefore, our schema will need to store the following information:
- users
- posts
We will also need to store the Stripe subscriptions and user thresholds, but we will do that later on.
Users Table
As we have mentioned in the previous lesson, Supabase stores the users' information in a table called `auth.users`. If we want to store additional data about the users, we can create a new table called `users` and link it to the `auth.users` table using a foreign key.
While not important for this application, this is a good practice to follow in general, as your application will likely need to store additional information about the users.
Initial Schema Setup
First of all, let's create a new migration file, which we simply call `schema`, by running the following command:

```bash
./node_modules/.bin/supabase migration new schema
```
This will create a new migration file in the `supabase/migrations` folder, identified by a timestamp.
Users Table Schema
The first table we will create is the `users` table. This table will store the following information:

- `id` - the user's ID (the same as the user UUID in the `auth.users` table)
- `photo_url` - the user's profile photo (optional)
- `display_name` - the user's display name (optional)
- `created_at` - the date when the user signed up (automatically generated)
Open the migration file generated by the Supabase CLI and add the following code:
```sql
create table users (
  id uuid references auth.users not null primary key,
  photo_url text,
  display_name text,
  created_at timestamptz not null default now()
);
```
Perfect, we have now created a new table called `users` with the columns we need. In the future, feel free to add more columns to this table if you need to store additional information about the users.
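For example, a follow-up migration could add an extra column. Here is a sketch (the column name is purely illustrative, not part of this course's schema):

```sql
-- Hypothetical example: track whether the user has completed onboarding
alter table users add column onboarded boolean not null default false;
```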
Auth Triggers
To create a row in the `users` table automatically when a user signs up, we will use a Postgres trigger function.

A trigger function is a function that is executed when a certain event occurs. In our case, we want to execute a function every time a new user is created in the `auth.users` table.
Paste the code below in the migration file:
```sql
create function public.handle_new_user()
returns trigger
language plpgsql
security definer set search_path = public
as $$
begin
  insert into public.users (id, display_name, photo_url)
  values (
    new.id,
    new.raw_user_meta_data ->> 'display_name',
    new.raw_user_meta_data ->> 'photo_url'
  );

  return new;
end;
$$;

-- trigger the function every time a user is created
create trigger on_auth_user_created
  after insert on auth.users
  for each row execute procedure public.handle_new_user();
```
Thanks to this trigger, every time a new user is created, a new row will be added to the `users` table with the user's information.
Row Level Security (RLS)
Postgres Row Level Security (RLS) is a feature that allows us to control access to rows in a database table based on the characteristics of the user performing a query or mutation.
It provides fine-grained control to ensure that users can only access the data they are authorized to see. This is especially useful in multi-tenant environments where you need to prevent users from accessing others' data.
The policies defined for RLS can include restrictions based on the contents of the rows themselves, or attributes of the user, further enhancing the security of your data.
Supabase allows us to use RLS out of the box, and with access to the user's properties (such as the user ID), we can easily define policies that prevent users from accessing data that does not belong to them.
I particularly like RLS because it lets us skip writing defensive code in our codebase and shift that responsibility to the database level, allowing us to write far less imperative code: queries and mutations that do not respect the policies will simply be rejected.
First, we need to enable RLS on the `users` table:

```sql
alter table users enable row level security;
```
Next, we create an RLS policy that prevents users from reading or writing data that does not belong to them:
```sql
create policy "Users can read and update only their own records" on users
  for all
  using (auth.uid () = users.id)
  with check (auth.uid () = users.id);
```
In the snippet above, we use both `using` and `with check`:

- We use `using` when reading from the table
- We use `with check` when writing to the table
The policy above is created using the `create policy` statement, followed by the policy name and the table name.

The policy applies to all operations on the `users` table, as specified by the `for all` clause. The `using` clause specifies a condition that must be true for a row to be visible: in this case, the authenticated user's ID must match the ID of the row in the `users` table. This ensures that users can only read and update data that belongs to them.
The `with check` clause specifies a condition that must be true for a row to be inserted or updated. In this case, the condition is the same as the one specified in the `using` clause, which ensures that users can only write rows that belong to them.
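To make the difference concrete, here is how the two clauses behave once the policy is in place. This is a sketch; the UUID below is just a placeholder for another user's ID:

```sql
-- Filtered by the using clause: returns only the signed-in user's
-- own row, with no explicit where clause needed.
select * from users;

-- Matches zero rows: the using clause hides other users' rows
-- from the current user, so this update silently affects nothing.
update users
set display_name = 'New name'
where id = '00000000-0000-0000-0000-000000000000';
```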
You can also restrict a policy to individual operations, such as `select`, `insert`, `update`, or `delete`. In our case, we want to apply the policy to all operations, so we use `for all`.
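For illustration, the same protection could be split into per-operation policies instead of a single `for all` policy. This is a sketch; the policy names are arbitrary:

```sql
-- A read-only policy: select policies take only a using clause
create policy "Users can read their own record" on users
  for select
  using (auth.uid () = users.id);

-- A write policy: update policies take both using and with check
create policy "Users can update their own record" on users
  for update
  using (auth.uid () = users.id)
  with check (auth.uid () = users.id);
```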
Blog Posts Schema
Next, we want to build the blog post schema.
The schema will contain the following properties:
- `id` - the auto-generated ID for the post
- `uuid` - the UUID we use to reference the blog post externally
- `user_id` - a foreign key linking the blog post to the user that created it
- `title` - the title of the post (cannot be null)
- `content` - the content of the post (cannot be null)
- `description` - the description of the post
- `created_at` - the date when the post is created (automatically generated)
Add the following code to the migration file:

```sql
create table posts (
  id bigint generated always as identity primary key,
  uuid uuid not null unique default gen_random_uuid(),
  user_id uuid not null references public.users on delete cascade,
  title text not null,
  content text not null,
  description text,
  created_at timestamptz not null default now()
);
```
Blog Posts RLS
Just like for the `users` table, we need to enable RLS on the `posts` table to protect it from unauthorized access:
```sql
alter table posts enable row level security;

create policy "Users can read and update only the posts belonging to them" on posts
  for all
  using (auth.uid () = posts.user_id)
  with check (auth.uid () = posts.user_id);
```
To protect the `posts` table, we check that the ID of the currently signed-in user matches the post's `user_id` property.
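For example, with this policy in place, inserting a post on behalf of another user fails the `with check` condition and is rejected by Postgres. This is a sketch; the UUID is a placeholder for another user's ID:

```sql
-- Signed in as user A, trying to create a post owned by user B:
-- the condition auth.uid() = posts.user_id fails, so the insert
-- raises a row-level security violation.
insert into posts (user_id, title, content)
values ('00000000-0000-0000-0000-000000000000', 'A title', 'Some content');
```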
Perfect - we're almost there.
Resetting the local database
After you make changes to the database schema, you want to reset the local database so that it picks up the new changes.
To do so, you have two choices:
- Stop and restart the Supabase containers
- Run the `reset` command
In most cases, the second way is faster. You can run the following command:
```bash
npm run supabase:db:reset
```
To verify your setup, navigate to the Supabase Studio, where you will find the new tables. If the tables are all there, good!
If you are currently signed in to the application, you will need to sign out and sign in again, as the state of the DB has changed.
Generating the Types
Last but not least, we need to generate the types for the database schema. To do so, run the following command:
```bash
npm run typegen
```
This is a command we added earlier during the setup. It generates the `database.types.ts` file that we can use to add strong typing to the Supabase client.
Conclusion
In this post, we have learned how to create a migration, some fundamentals around RLS and the schema for our small SaaS application.
This is not quite the end of it: later on, we will introduce additional tables for storing the Stripe subscriptions and user thresholds. But we don't have to worry about that now.
What's Next?
Next, we will start implementing the ChatGPT API to generate posts using Forms and Next.js Server Actions.
It's going to get fun! 🚀