Five steps to building a web app without writing a line of code
How you can use the same logic to build software in a day
Automation > everything
If you're an information worker and serious about your work, you HAVE to be using technology as your friend.
I'm not talking about social media. I'm talking about actual software that you can use to run automations, save time, and 10x your output. If you aren't learning the skills to operate like this, you're NGMI.
It's never been easier to build things. You don't need to learn computer science, you don't need to break the bank on a degree, and you don't need to continually outsource work to somebody else who can code.
You need personal accountability and the ability to use logic.
I built this, and I can't code. You have no excuses.
Here are the steps I took to create this in a day so that you can do the same. If this is helpful, let me know, and I'll start creating more walkthroughs like this.
Tools you need
Airtable subscription: get started for free
Softr subscription: free plan forever, with a 30-day free trial on any paid plan
Simplescraper: get started for free
Zapier subscription: get started for free
Google Sheets: free
Urlbox subscription (optional): starts at $10 / month
Ideation and background
I never liked manually sourcing companies to evaluate, having multiple browser windows open distracts me, and I like having information centralized in one place.
I figured that other investors have similar feelings, so I wanted to create a deal flow scraper that pulls in startup data from some of my favorite databases.
That was the genesis of this idea, so I built this to test it.
Step 1: determine sources of information
I decided not to reinvent the wheel here.
When I look for new startups, I essentially scan four different sources for information:
There are countless other places to add, but these are the ones I focused on for the first version of this project.
Step 2: find what you want to gather from listing websites
Listing websites (for the most part) only include a certain amount of information on their landing page; they want you to click in to find the rest.
To understand what you can pull from a website, you first have to understand what elements are actually included on it. These elements live inside the website's HTML, and you can find them by right-clicking on something within a page and clicking 'Inspect Element' (you can also press Command+Option+I on a Mac or F12 on a PC). From this view, you'll be able to see which HTML elements you can pull from each page. This includes things like a title, description, and URL.
If you're non-technical and HTML seems scary, it's not. Take five minutes to read this overview, and you'll be good to go.
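To make the idea concrete, here is a minimal sketch of what a listing card might look like under 'Inspect Element', and how a scraper maps those elements to properties. The HTML, class names, and company are all hypothetical; a real listing site will use its own markup.

```python
# Hypothetical listing-card HTML, and a tiny parser (Python stdlib only)
# that pulls out the three properties a scraper would target.
from html.parser import HTMLParser

LISTING_HTML = """
<div class="listing">
  <h2 class="title">Acme Startup</h2>
  <p class="description">API-first widgets for developers.</p>
  <a class="url" href="https://example.com/acme">Visit site</a>
</div>
"""

class ListingParser(HTMLParser):
    """Collects the title, description, and URL from one listing card."""
    def __init__(self):
        super().__init__()
        self.fields = {}
        self._current = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        cls = attrs.get("class", "")
        if cls in ("title", "description"):
            self._current = cls          # next text node belongs to this field
        elif cls == "url":
            self.fields["url"] = attrs.get("href", "")

    def handle_data(self, data):
        if self._current and data.strip():
            self.fields[self._current] = data.strip()
            self._current = None

parser = ListingParser()
parser.feed(LISTING_HTML)
print(parser.fields)
```

Tools like Simplescraper do exactly this element-matching for you when you click on a property; you never have to write the parser yourself.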
Step 3: scrape a website without coding
We tested a couple of different pieces of software before landing on Simplescraper. It's a Chrome extension, it's incredibly easy to use, and it matched our budget ($35/month for the paid plan).
Once we downloaded the plugin, it was pretty straightforward how to get information pulled in.
Steps:
1. Go to the site you want to scrape (using Google Chrome as your browser).
2. Click into the Simplescraper plugin and select "Scrape this website".
3. Create a property and give it a name.
4. Hover over the property you want to scrape into Google Sheets.
5. Click into the property, review the previewed results, then click the check mark.
6. Once you're done scraping properties, click "View results".
7. After being redirected to the results page, click "Save recipe".
8. Give your recipe a name, make sure all of the information is correct, then click into "Show advanced options".
9. Schedule the scraper to run once a day.
10. Keep the rest of the settings untouched.
11. Create the recipe, then click into your new recipe shown on the left-hand side of your screen.
12. Click into "Integrate", then toggle on Google Sheets.
Now you've built a basic scraper, and every day the property data from the website you chose in step 1 will pull into the Google Sheet you created in step 12.
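Under the hood, the daily sync amounts to appending one row per scraped listing to your Sheet. A rough sketch of that output, using the csv module and hypothetical property names in place of the real Google Sheet:

```python
# Simulate the daily Simplescraper -> Google Sheets output: one row per
# scraped listing. Property names ("title", "description", "url") are
# assumptions standing in for whatever properties you defined.
import csv, io

def append_rows(listings):
    """Render scraped listings as spreadsheet rows with a header."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["title", "description", "url"])
    writer.writeheader()
    writer.writerows(listings)
    return buf.getvalue()

scraped = [{"title": "Acme", "description": "Widgets", "url": "https://example.com"}]
print(append_rows(scraped))
```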
Step 4: transform your data
You've done the hard work of setting up a system to consistently pull data. Now you want an easy way to make that data presentable.
We recommend transferring your Google Sheet data over to Airtable, cleaning it up with add-ons (optional), and then pulling that cleaned data into Softr.
Here's how we recommend doing that.
Google Sheets → Airtable
What you need for this step:
An Airtable base with the columns mapped to match your Google Sheet
A Zapier subscription (if you aren't already using Zapier, hopefully this changes that)
If you're familiar with Zapier and no-code automation, this step is straightforward. If you aren't, Zapier will help make this easy once you log into your new account.
The logic: whenever a new row is created in the Google Sheet for your scraper output, that record is automatically synced to your Airtable base. When you log into Zapier, you'll need to create a new Zap. Here's what you input when instructed to do that:
The trigger: New spreadsheet row created in Google Sheets
The action: Create new record in Airtable
Alternatively, you can just copy the Zap weâve already made (linked HERE).
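The core of the Zap is just a row-to-record mapping: each Sheet column becomes an Airtable field of the same name. A sketch of that mapping, with hypothetical column names ("Name", "Description", "URL") — Zapier does this for you, so this is only to show the logic:

```python
# Map one new Google Sheet row onto an Airtable create-record payload.
# Column names are assumptions; match them to your own Sheet headers
# and Airtable base columns.
SHEET_HEADERS = ["Name", "Description", "URL"]

def row_to_airtable_record(row):
    """Pair each cell with its column header, Airtable-payload style."""
    return {"fields": dict(zip(SHEET_HEADERS, row))}

new_row = ["Acme Startup", "API-first widgets", "https://example.com/acme"]
print(row_to_airtable_record(new_row))
```

This is also why the Airtable base columns need to be mapped to match your Google Sheet: if the names line up, the Zap needs no extra configuration.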
Adding images (optional)
Some sites make it hard to scrape image data, so this is a workaround if the presentability of your data matters to you. Skip to the next section if that's not you.
What you need for this step:
A subscription to Urlbox
Create a new column in your Airtable base to show images. Make this column an "Attachment" field.
The trigger: New record is created in Airtable
Action #1: Generate screenshot URL from Urlbox
Output file type: PNG
Viewport Width: 320
Viewport Height: 600
Hide cookie banners: true
Retina: false
Action #2: Update record in Airtable
Update the empty product image field to include the screenshot URL created by Urlbox
Alternatively, you can just copy the Zap weâve already made (linked HERE).
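For the curious, the screenshot URL Urlbox generates is just a render link carrying the settings above as query parameters. A sketch of how such a URL is assembled — the endpoint shape and exact parameter names here are assumptions, so check Urlbox's docs for the current API, and the key is a placeholder:

```python
# Build an Urlbox-style render link for a record's URL. Parameter names
# mirror the Zap settings above (viewport 320x600, hide cookie banners,
# no retina); treat them and the endpoint as assumptions.
from urllib.parse import urlencode

URLBOX_KEY = "YOUR_PUBLISHABLE_KEY"  # placeholder, not a real key

def screenshot_url(target_url):
    params = urlencode({
        "url": target_url,
        "width": 320,                   # viewport width from the Zap
        "height": 600,                  # viewport height from the Zap
        "hide_cookie_banners": "true",
        "retina": "false",
    })
    return f"https://api.urlbox.io/v1/{URLBOX_KEY}/png?{params}"

print(screenshot_url("https://example.com/acme"))
```

Pasting a URL like this into the Airtable attachment field is what makes the image render in your base.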
Step 5: putting a front end on your cleaned data
What you need for this step:
A Softr subscription (they offer free trials)
If you're still following along, you've found a website to scrape, you've developed a system to consistently scrape website data into a Google Sheet, and you've transformed that data into something presentable by adding images if they weren't otherwise available.
Now comes the fun part: making all of that hard work look like a real application.
We recommend using Softr for this step. We started using their software two months ago, and it's given us superpowers.
Finding a template
After you create an account (they offer free trials), you'll need to select an existing template to get started.
This is a matter of personal preference, but if you plan on building something similar to this, you'll want something with lists built in. Most of these are in the "Resource Directories" section of the templates.
Connecting Softr to Airtable
When you click into your new app, it will be filled with dummy data.
To change this, you'll need to click into the list section within Softr, click into "Data" on the right side of the screen, and connect the Airtable base you built in the previous step.
If you are using Softr for the first time, they will ask you for your Airtable API key to authenticate yourself first. If you're struggling to get that set up, Softr has made a walkthrough explainer, and I've linked it here.
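If you are wondering what that API key is for: tools like Softr use it to authenticate every request they make to Airtable's REST API on your behalf, as a bearer token. A minimal sketch (the base ID, table name, and key below are placeholders; the request is built but never sent):

```python
# Sketch of an authenticated Airtable API request, as a no-code tool
# would issue it behind the scenes. IDs and key are placeholders.
from urllib.request import Request

AIRTABLE_KEY = "keyXXXXXXXXXXXXXX"   # placeholder API key
BASE_ID = "appXXXXXXXXXXXXXX"        # placeholder base ID

req = Request(
    f"https://api.airtable.com/v0/{BASE_ID}/Startups",  # "Startups" is a hypothetical table name
    headers={"Authorization": f"Bearer {AIRTABLE_KEY}"},
)
print(req.full_url)
```

You never write this yourself; you only paste the key into Softr once, and it attaches the header for you.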
Mapping your data in Softr
Almost done. Now you have the right data in Softr; you just need to map it correctly.
In the list section of Softr, go into "Features" and, for each of the item's fields, map it to the correct data, rename it so it has the correct label, and remove any unnecessary items.
If you have a button or something similar and want to redirect people to a URL when they click into that button, you have two options:
You can reroute to whatever URL you are capturing in your database if you are scraping this as a property. This is the easy option.
You can build a separate page in Softr that lets you click into any company and see more information about that specific company. This is the harder option, but it is worth doing if you want your users to stick around longer (versus exiting to the company URL).
That's it! You've now taken a project from 0 to 1, and you have a tool you can use to automate the collection of whatever information you want.
If you need help automating your systems, I can help. Just respond to this email, and we can figure something out.
Featured jobs
Jasper is looking for a remote revenue sales analyst.
Surfer is looking for a remote product designer.
Odin is looking for a growth and partnerships lead in London.
Sydecar is looking for a remote marketing manager.
Links we like
This week's episode is brought to you by:
Softr: The easiest way to build professional web apps on top of Airtable
If you're using Airtable to store data, you HAVE to layer Softr on top.
Their software lets you turn your ugly databases into beautiful web apps. We've used Softr to build our investor directory, public roadmap, and the Signal Tracker that this newsletter walked through how to build.