Tag: GitHub

  • Lovable to Local with Supabase

    I have a client who has created a really neat app via Lovable. They needed help moving from what they have to launch.

    One of the first concerns I had was how to set up a development environment for the project outside of Lovable.

    tl;dr Creating a local environment for a Lovable app required cloning the GitHub repo and then setting up a local Supabase install by seeding data from the remote database.

    Note: This tutorial is for Lovable projects which already have Supabase and GitHub integrated with the app. If your Lovable project isn’t there yet, here is the documentation for integrating Supabase and GitHub.

    Before you get started

    There were a few gotchas which I ran into. I deal with them throughout the tutorial, so you can skip ahead if you like. If you want a heads up of what to watch out for, continue here.

    Self-Hosted via Docker vs. the Supabase CLI

    This feels a bit like the struggle of self-hosted WordPress vs. WordPress.com. Like WordPress, Supabase is both an open-source project which can be self-hosted on your own servers via Docker and an enterprise hosting service which will host the databases for you. The important distinction for this tutorial: in order to work on a Supabase project locally, you do need Docker Desktop installed and running, but you do not need to go through the trouble of setting up the full self-hosted Docker environment. That will send you down a rabbit trail. Instead, install the Supabase CLI and initialize the project from there, as described in Steps 3-5.

    The Remote Database Password is Required during First Connections

    This may seem like a “well, duh,” but I tried my hardest to work around it, mostly because my client wasn’t sure what the password was and I didn’t want to force them into resetting it, as I wasn’t sure how that would affect the Lovable app. This password isn’t tied to a user account but to the database itself. I tried downloading a copy of the backup and restoring it locally, but that turned out to be its own headache. The native tools built into the Supabase CLI make accessing the remote database much quicker if you have the password. My client and I ended up resetting it. Luckily, changing the password didn’t seem to have any effect on the Lovable app, and I didn’t have to hunt around for places where the password was used.

    Step 1: Install Supabase CLI

    If you already have Supabase CLI installed, then you can skip ahead.

    If not, the documentation for installing the CLI can be found here: https://supabase.com/docs/guides/local-development/cli/getting-started

    Click on the environment you are using to install the CLI and it will provide the instructions.

    In my case (macOS), I used Homebrew via command line in Terminal.

    brew install supabase/tap/supabase
    

    Step 2: Clone the Lovable GitHub repo

    $ git clone <Your Repo URL>
    

    Step 3: Initialize Local Supabase

    Access the folder where you cloned the Lovable project from GitHub. There should already be a supabase directory. It may contain sub-directories like functions and migrations.

    % cd <Your Repo Name>
    

    You probably don’t want to commit any of the local configuration files to the repo which you share with Lovable. Add the following items to your .gitignore file:

    # Ignore Supabase local Development files
    supabase/seed.sql
    supabase/.branches/
    supabase/.temp/
    supabase/functions/.env
    supabase/config.toml
    

    Important: Make sure to save and commit the change to the .gitignore file before running supabase init. This way, Git won’t try to track the files Supabase creates.

    Next, run the Supabase initialization:

    <Your Repo Name>% supabase init
    

    Now that a local Supabase project has been initialized, the database can be connected to your remote project.

    To log into your Supabase account where the remote project is:

    <Your Repo Name>% supabase login
    

    The response will be a link to open in a browser.

    Hello from Supabase! Press Enter to open browser and login automatically.
    
    Here is your login link in case browser did not open https://supabase.com/dashboard/cli/login?session_id={...}
    
    

    The link will either prompt you to log in to your Supabase account or immediately redirect you to the authorization page. Once logged in, the authorization page will show a verification code. Copy this into your terminal.

    Enter your verification code: {AUTHCODE}
    
    Token cli_jessboctor@{machinename}.local_{...} created successfully.
    
    You are now logged in. Happy coding!
    

    Once you have logged in, you can link the local instance to a remote project:

    <Your Repo Name>% supabase link
    

    The response will be a list of the projects in your remote Supabase workspace. Use the ⬆️ and ⬇️ keys to highlight the project you want and press Enter. (If you already know the project ref, you can skip the picker by running supabase link --project-ref <your-project-ref>.)

    Here, you may be asked for the database password for this project. This is not your user account password; it is the password you set when you created the Supabase project. If the password is valid, the supabase/config.toml file will be updated to reflect the project you linked to.

    Step 4: Seed & Start the Database

    In order to set up the local database with pre-populated data, you need to download a seed file from the remote Supabase project.

    <Your Repo Name>% supabase db dump --data-only > supabase/seed.sql
    
    

    This command dumps the data from the linked remote database into your-repo-name/supabase/seed.sql. The Supabase CLI picks this file up when it creates the local database.

    To start the database:

    <Your Repo Name>% supabase start
    

    Once the services have started, you should see a list of local URLs and ports for the various components. For example, the “Studio” URL gives you a local version of the Supabase dashboard. (If you need this list again later, running supabase status will reprint it.)

    Step 5: Update the environment keys

    Now that the local Supabase tables are running, we need to connect the app to the local tables rather than the remote ones.

    Open the Supabase Studio URL in a browser (it is most likely http://127.0.0.1:54323). When you see the Supabase dashboard, click on the “Connect” button in the upper right-hand corner.

    When the connect dialogue opens, click on the “App Frameworks” button. Copy the “{…}_Supabase URL” and “{…}_Supabase_Anon_Key” values from the dialogue.

    In your code editor, open the your-repo-name/integrations/supabase/client.ts file. We need to replace the SUPABASE_URL and SUPABASE_PUBLISHABLE_KEY values. The file should look like this:

    // This file is automatically generated. Do not edit it directly.
    import { createClient } from '@supabase/supabase-js';
    import type { Database } from './types';
    
    const SUPABASE_URL = "<Your Remote Supabase URL>";
    const SUPABASE_PUBLISHABLE_KEY = "<A really long string>";
    
    // Import the supabase client like this:
    // import { supabase } from "@/integrations/supabase/client";
    
    export const supabase = createClient<Database>(SUPABASE_URL, SUPABASE_PUBLISHABLE_KEY);
    
    

    You will want to replace the values with the ones you copied from the local Supabase Studio Connect:

    // This file is automatically generated. Do not edit it directly.
    import { createClient } from '@supabase/supabase-js';
    import type { Database } from './types';
    
    const SUPABASE_URL = "http://127.0.0.1:54321";
    const SUPABASE_PUBLISHABLE_KEY = "<A really long but different string>";
    
    // Import the supabase client like this:
    // import { supabase } from "@/integrations/supabase/client";
    
    export const supabase = createClient<Database>(SUPABASE_URL, SUPABASE_PUBLISHABLE_KEY);
    
    

    You will also need to either create a .env.local file that overrides the values in the .env file, or replace the values in the .env file directly.
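    For reference, a local override might look something like this. The variable names here are assumptions based on a typical Lovable/Vite setup; check your own .env file for the exact names your project uses:

    ```shell
    # .env.local: local Supabase values copied from the Studio "Connect" dialogue
    VITE_SUPABASE_URL="http://127.0.0.1:54321"
    VITE_SUPABASE_ANON_KEY="<your local anon key>"
    ```

    Vite loads .env.local on top of .env, so these values win during local development without touching the committed file.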

    Step 6: Start the webapp

    Conveniently, the README.md file from Lovable should include instructions on how to stand up a local version of the web app, which was originally connected to the remote version of the Supabase project.

    # Step 1: Clone the repository using the project's Git URL.
    git clone <YOUR_GIT_URL>
    
    # Step 2: Navigate to the project directory.
    cd <YOUR_PROJECT_NAME>
    
    # Step 3: Install the necessary dependencies.
    npm i
    
    # Step 4: Start the development server with auto-reloading and an instant preview.
    npm run dev
    

    Running npm run dev will respond with another local URL (http://localhost:8080/). When you load this URL in a browser, you should see a local version of your Lovable app!

    That’s it! You should be ready to go!

    Pro-tip: Lovable commits the .env file into the Git repo. Keeping track of the environment variables there is a pain, and it can lead to accidentally committing your local Supabase URL and anon key to the repo (ask me how I know).

    Here is the workaround I figured out:

    1. Remove .env from the git repo using git rm --cached .env
    2. Add .env to your .gitignore file and commit the change
    3. Edit the .env file to contain an environment variable which can easily be set to “true” or “false”: VITE_IS_LOCAL="true"
    4. Edit the integrations/supabase/client.ts file to set the SUPABASE_URL and SUPABASE_PUBLISHABLE_KEY based on the environment variable.
    // This file is automatically generated. Do not edit it directly.
    import { createClient } from '@supabase/supabase-js';
    import type { Database } from './types';
    
    let SUPABASE_URL = "";
    let SUPABASE_PUBLISHABLE_KEY = "";
    const isLocal = import.meta.env.VITE_IS_LOCAL === "true";
    
    if (isLocal) {
        SUPABASE_URL = "http://127.0.0.1:54321";
        SUPABASE_PUBLISHABLE_KEY = "<The long local string>";
    } else {
        SUPABASE_URL = "<Remote project URL>";
        SUPABASE_PUBLISHABLE_KEY = "<The long remote string>";
    }
    
    // Import the supabase client like this:
    // import { supabase } from "@/integrations/supabase/client";
    
    export const supabase = createClient<Database>(SUPABASE_URL, SUPABASE_PUBLISHABLE_KEY);
    
    

    Deploying Changes

    In order to keep version control continuous between Lovable and Supabase, you need to push changes to supabase/migrations and supabase/functions twice: first via the Supabase CLI to push the changes to the remote database, and then via Git to push the changes to Lovable.

    You can find information on Migrations and deploying changes here: https://supabase.com/docs/guides/deployment/database-migrations

    You can find information on Edge Functions and deploying changes here: https://supabase.com/docs/guides/functions/quickstart-dashboard


    Does this process work for you? Got any great tips on how to make it work even better? Let me know!

  • Making a SCUD Plugin

    I am working with a company who needs a nicely formatted way to display a group of employees. Currently, the team page is using hardcoded [author] shortcodes to format the page, but this makes it a pain for the staff to update the team information.

    So they don’t.

    So, I am creating a Simple Custom User Display, or SCUD, plugin.

    Did you think I was talking about missiles? Also, did you know there is a type of baby shrimp called a scud? You can learn about them here.

    I am sure there are already WordPress plugins which have a similar functionality to display users in a page archive. However, in this case, the company that I am working with doesn’t need a bunch of bells and whistles. Since they don’t have someone who regularly handles updating the site, keeping things light and less likely to run into update conflicts would be best.

    Also, I haven’t built a plugin from scratch in a bit. The majority of my recent work has been in digging through legacy code and surgically making improvements and refactors. Building something new seems like fun.

    Before I started writing any code, I spent some time thinking through the plugin and what this company needed. I thought it might be an interesting exercise to share.

    Problem Statement:

    The “Company” needs to improve their current “Team” page. Currently, it is out of date and is not easily updated by the staff at the Company. The current page is making use of the [author] shortcode in order to format contact information which is hard coded into the page directly. The metadata for each team member is not being pulled from a custom post type or user profile. This means that in order to edit a single piece of information (e.g. update a team member title), the user must sort through the page code to make changes.

    Solution:

    Create a lightweight plugin which uses as much core WordPress functionality as possible to make each team member information its own dataset. This way, the team member information can be easily pulled, sorted, filtered, and displayed on the front end. To make this easily editable in the future, utilize WordPress users to contain the sales rep information and metadata.

    Information store: Users

    The current website displays the team members grouped by their team and state. We can maintain this information if we create a taxonomy for users. This looks to be a simple case of registering a custom taxonomy on the user type.

    After creating the taxonomies, we want to limit the capabilities of the team member users. We can do this by creating a custom “Team Member” user role and limiting what it has access to. The role only needs to be added during the activation hook of our plugin; after that, we can begin assigning users to the role.

    The last piece of information we need to include is custom user meta fields. The team members have extra information, like their titles and regions, which we need to be able to save. We can include this in the edit_user_profile hook.

    The end goal here is that any team member could log in to edit their own profile information. If they don’t have their login, an administrator would also be able to update the contact information for them.

    Display

    Option 1: DataViews (Stretch)

    WordPress has recently introduced DataViews to core. This allows data sets (such as users) to not only be displayed, but also searched, filtered, and sorted. 

    DataViews has multiple layout options, including a table, grid, or list. Since the DataViews fields each render the component in custom React, we can customize the layout of the user information and how it is displayed.

    Additionally, since not all “fields” have to be displayed to be used for filtering, we can use the taxonomies to filter the users.
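    As a rough sketch of the idea: the field shape below (id, label, getValue, elements, filterBy) follows the @wordpress/dataviews API, but the team-member data and region list are made up for illustration.

    ```typescript
    // Hypothetical team-member shape; the real data would come from WP users.
    type TeamMember = { name: string; meta: { title: string; region: string } };

    const regions = [
    	{ value: 'east', label: 'East Coast' },
    	{ value: 'west', label: 'West Coast' },
    ];

    // DataViews-style field definitions: a field can be sortable, and a field
    // with `elements` can drive filtering even when it is never displayed.
    const fields = [
    	{
    		id: 'title',
    		label: 'Title',
    		getValue: ( { item }: { item: TeamMember } ) => item.meta.title,
    		enableSorting: true,
    	},
    	{
    		id: 'region',
    		label: 'Region',
    		getValue: ( { item }: { item: TeamMember } ) => item.meta.region,
    		elements: regions,
    		filterBy: { operators: [ 'isAny' ] },
    	},
    ];

    const member: TeamMember = {
    	name: 'Jess',
    	meta: { title: 'Sales Lead', region: 'east' },
    };
    console.log( fields[ 0 ].getValue( { item: member } ) );
    ```

    The payoff is that adding a new team-member user automatically adds a row to the view, with sorting and filtering for free.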

    The goal here is to allow users to be added to the website and then be automatically added to the Sales Team page without needing to edit the page content itself.

    If we contained the DataViews within a Block (for Divi or Gutenberg) then we would also be able to add custom content above or below the block without having to touch code.

    Divi can now display Gutenberg blocks. So this means we only need to create the block as part of Gutenberg.

    Option 2: Page Template

    This is basically the same principle as option 1, but with a custom page template. This is less flexible and doesn’t allow for searching, sorting, or filtering.

    Option 3: Divi Drag and Drop Page

    This option would be the most similar to the current implementation (the layout being created within the page editor). While we would still have the benefits of better user information control, it wouldn’t solve the long-term ease-of-use problem. It would likely be the least time-consuming option, though.

    I am developing a boilerplate version of this plugin here: https://github.com/JessBoctor/simple-custom-user-display

    The goal is not to create a WYSIWYG plugin which allows a user to install and customize the new user role and display. Rather, it is a lightweight plugin with enough instructions on how to customize things that a dev can pick it up and make it their own.

    Which is what I will do for the company I am working with 🙂

    PS Before and after images will be coming soon once I have the website refresh completed.

  • Calling for Backup

    I am currently working on a project to convert a list of active purchases from a custom React component to a built-in WordPress component, DataViews. The goal of the project is to utilize the built-in sorting, filtering, and pagination of DataViews to improve the user experience by making it easier to find specific subscriptions in an account.

    GitHub Issue: https://github.com/Automattic/wp-calypso/issues/86616

    A big part of this project has been detangling functionality from class components so that the logic could be exported from the original PurchaseItem component to render within the columns of the DataViews table. I went through this process for each of the four columns in these PRs:

    It looks like I missed a notice when rendering the payment method, though. In the original PurchaseItem component, not only does the payment method type render, but there is also a notification about whether a backup payment method exists.

    Production vs. WIP DataViews (screenshots)

    While I was working on the CSS for the DataViews table, I realized that the notice was missing. Exporting the function for the notice and adding it to the payment-methods column wasn’t a problem. However, determining if the component should be rendered threw me a curveball.

    The parameter that determines if the component should be rendered is the const isBackupMethodAvailable. However, this value isn’t available within the Purchases.Purchase type. It is actually set and passed to the PurchaseItem component one step higher in the chain, within the PurchasesSite component.

    { purchases.map( ( purchase ) => {
    	const isBackupMethodAvailable = cards.some(
    		( card ) => card.stored_details_id !== purchase.payment.storedDetailsId && card.is_backup
    	);
    
    	return (
    		<PurchaseItem
    			{... A Bunch of Props...}
    			isBackupMethodAvailable={ isBackupMethodAvailable }
    		/>
    	);
    } ) }
    

    At face value, this isn’t a hard fix. I could copy this logic for setting isBackupMethodAvailable over to the PurchaseItemRowPaymentMethod component within the DataViews fields. This would allow me to check the two conditions for showing the notice:

    • Is the current payment method different from the one being used for the purchase?
      • We need the current payment method to be different from the payment method assigned to the purchase. If it isn’t, then the payment method isn’t really a backup; it’s just the payment method assigned to the purchase. Even if the current payment method is marked to be used as a backup for other purchases, if there isn’t a secondary backup payment method available, then this purchase doesn’t have a backup available.
    • Is the card set to be used as a backup payment method?

    If these two conditions are true, then there is a valid backup payment method and the notice is shown.
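    That check can be sketched as a small standalone helper. It mirrors the cards.some logic from PurchasesSite shown above; the type is simplified to just the two fields that matter, and the sample cards are made up:

    ```typescript
    // Simplified shape of a stored payment method; the real type has more fields.
    type StoredPaymentMethod = { stored_details_id: string; is_backup: boolean };

    // A backup is available when some card is (1) not the card already assigned
    // to the purchase and (2) flagged as a backup payment method.
    function hasBackupMethod(
    	purchaseStoredDetailsId: string,
    	cards: StoredPaymentMethod[]
    ): boolean {
    	return cards.some(
    		( card ) =>
    			card.stored_details_id !== purchaseStoredDetailsId && card.is_backup
    	);
    }

    const cards: StoredPaymentMethod[] = [
    	{ stored_details_id: 'card-1', is_backup: false },
    	{ stored_details_id: 'card-2', is_backup: true },
    ];

    console.log( hasBackupMethod( 'card-1', cards ) ); // card-2 is a valid backup
    console.log( hasBackupMethod( 'card-2', cards ) ); // no other backup exists
    ```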

    The tricky part is when you start thinking about performance and scale. If a user only has a few purchases and one or two payment methods, then copying this logic over may not be a big issue and the user likely won’t see any drag.

    However, if the user has hundreds of purchases and lots of payment methods, this could cause a problem. DataViews renders fields one item at a time. Currently, the payment-method field uses the purchase as its item. Since that type doesn’t include the backup payment method information, I can’t just pass it in. So copying and pasting this logic over would mean that for every row of the DataViews table, I would need to fetch all of the customer’s payment methods, filter through them to see if there are payment methods which are a backup but not assigned to the purchase, and then repeat it all again in the next row.

    See how that could get expensive really quickly?

    To avoid this performance drain, I have two other options:

    1. Shove the isBackupMethodAvailable const into the Purchases.Purchase type
    2. Fetch the array of payment methods only once and pass it to getPurchasesFieldDefinitions via the usePurchasesFieldDefinitions hook

    I don’t love adding a parameter to a type for a single use case. So I am going with option 2.

    The first step is to pass the array of payment methods (StoredPaymentMethod) to the hook from the purchases list index file. This is the closest place where we have this context already available.

    // purchases-list-in-dataviews/index.tsx
    
    <PurchasesDataViews
    	purchases={ purchases }
    	translate={ translate }
    	paymentMethods={ this.props.paymentMethodsState.paymentMethods }
    />
    

    We then pass the data from the DataView to the usePurchasesFieldDefinitions hook, and finally to the getPurchasesFieldDefinitions function.

    // purchases-data-view.tsx
    
    export function PurchasesDataViews( props: {
    	purchases: Purchases.Purchase[];
    	translate: LocalizeProps[ 'translate' ];
    	paymentMethods?: StoredPaymentMethod[];
    } ) {
    	// ...Other logic...
    	const purchasesDataFields = usePurchasesFieldDefinitions( paymentMethods );
    	// ...More logic...
    }
    
    // use-field-definitions.ts
    
    export function usePurchasesFieldDefinitions( paymentMethods ) {
    	// Do some stuff with the payment methods
    	// Other hook logic
    }
    

    Now, remember, there are two conditions for a purchase to be considered as “having a backup payment method”. The first is that the id of the StoredPaymentMethod assigned to the purchase cannot match the backup method’s id, and the second is that the StoredPaymentMethod has to be set as a backup.

    Let’s flip the order in which those items are checked. This way, we can send only the StoredPaymentMethods which are a backup to getPurchasesFieldDefinitions. When each row compares the payment method assigned to that purchase, it will have a smaller scope of payment methods to test against since we have already weeded out any payment methods which aren’t meant to be used as a backup.

    const backupPaymentMethods = paymentMethods.filter(
    	( paymentMethod ) => paymentMethod.is_backup === true
    );
    

    Once we pass that back to getPurchasesFieldDefinitions, we can render the backup payment method notice by checking whether the backupPaymentMethods array contains at least one payment method other than the one assigned to the current purchase.

    let isBackupMethodAvailable = false;
    
    if ( backupPaymentMethods ) {
    	// filter() returns a new array; capture the result instead of discarding it.
    	const otherBackupMethods = backupPaymentMethods.filter(
    		( paymentMethod ) => item.payment.storedDetailsId !== paymentMethod.stored_details_id
    	);
    
    	isBackupMethodAvailable = otherBackupMethods.length >= 1;
    }
    

    Now, we have checked for both conditions and saved ourselves from having to fetch a set of payment methods multiple times.
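    Put together, the filter-once approach looks roughly like this. The sample data and names are illustrative, not the actual Calypso code:

    ```typescript
    type StoredPaymentMethod = { stored_details_id: string; is_backup: boolean };
    type Purchase = { payment: { storedDetailsId: string } };

    const paymentMethods: StoredPaymentMethod[] = [
    	{ stored_details_id: 'card-1', is_backup: false },
    	{ stored_details_id: 'card-2', is_backup: true },
    ];

    // Condition 2 first: filter down to backup-flagged methods once, outside the rows.
    const backupPaymentMethods = paymentMethods.filter(
    	( paymentMethod ) => paymentMethod.is_backup
    );

    // Condition 1 per row: is any backup method different from the purchase's own?
    function isBackupMethodAvailable( item: Purchase ): boolean {
    	return backupPaymentMethods.some(
    		( paymentMethod ) =>
    			paymentMethod.stored_details_id !== item.payment.storedDetailsId
    	);
    }

    console.log( isBackupMethodAvailable( { payment: { storedDetailsId: 'card-1' } } ) );
    console.log( isBackupMethodAvailable( { payment: { storedDetailsId: 'card-2' } } ) );
    ```

    Each row now only scans the small pre-filtered array instead of refetching and refiltering the full payment method list.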

    Thoughts? Comments? Concerns? Have a better way to address the problem? Let me know in the comments.

    If you want to see the full work in progress, check out the PR here: https://github.com/Automattic/wp-calypso/pull/102742