đŸ“¶ Five steps to building a web app without writing a line of code

How you can use the same logic to build software in a day

Automation > everything

If you’re an information worker and serious about your work, you HAVE to make technology your friend.

I’m not talking about social media. I’m talking about actual software that you can use to run automations, save you time, and 10x your output. If you aren’t learning the skills to operate like this, you're NGMI.

It’s never been easier to build things. You don’t need to learn computer science, you don’t need to break the bank on a degree, and you don’t need to continually outsource work to somebody else who can code.

You need personal accountability and the ability to use logic.

I built this, and I can’t code. You have no excuses.

Here are the steps I took to create this in a day so that you can do the same. If this is helpful, let me know, and I’ll start creating more walkthroughs like this.

Tools you need

  • Simplescraper (scraping)

  • Google Sheets (raw data)

  • Zapier (automations)

  • Airtable (clean data)

  • Urlbox (screenshots, optional)

  • Softr (front end)

Ideation and background

I never liked manually sourcing companies to evaluate. Having multiple browser windows open distracts me, and I like centralizing information.

I figured that other investors have similar feelings, so I wanted to create a deal flow scraper that pulls in startup data from some of my favorite databases.

That was the genesis of this idea, so I built this to test it.

Step 1: determine sources of information

I decided not to reinvent the wheel here.

When I look for new startups, I essentially scan four different sources for information:

There are countless other places to add, but these are the ones I focused on for the first version of this project.

Step 2: find what you want to gather from listing websites

[Screenshot: HTML code for BetaList]

Listing websites (for the most part) will only include a certain amount of information on their landing page (they want you to click in to find the rest of the information).

To understand what you can pull from a website, you first have to understand what elements are actually included on it. These elements live inside the website’s HTML code, and you can find this code by right-clicking on something within a page and clicking ‘Inspect Element’ (you can also press Command+Option+I on a Mac or F12 on a PC). From this view, you’ll be able to see which HTML elements you can pull from each page. This includes things like a title, description, and URL.

If you're non-technical and HTML seems scary, it's not. Take five minutes to read this overview, and you'll be good to go.
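To make this concrete, here’s a minimal Python sketch of what ‘Inspect Element’ is showing you. The markup and class names below are invented for illustration; they aren’t BetaList’s real HTML:

```python
# A made-up listing card, standing in for what 'Inspect Element' shows
# you on a real site. None of this is BetaList's actual markup.
from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = """
<div class="startup">
  <a href="https://example.com"><h2>Acme AI</h2></a>
  <p class="pitch">Autonomous spreadsheets for indie investors.</p>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
card = soup.find("div", class_="startup")

# These three elements are exactly the kinds of properties you'll
# point a scraper at in the next step.
print("Title:      ", card.h2.get_text(strip=True))
print("Description:", card.find("p", class_="pitch").get_text(strip=True))
print("URL:        ", card.a["href"])
```

Whatever elements you can see in this view are the properties you’ll scrape in the next step.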

Step 3: scrape a website without coding

We tested a couple of different pieces of software before landing on Simplescraper. It’s a Chrome plugin, it’s incredibly easy to use, and it matched our budget ($35/month for the paid plan).

Once we downloaded the plugin, getting information pulled in was pretty straightforward.

Steps:

  1. Go to the site you want to scrape (using Google Chrome as your browser).

  2. Click into the Simplescraper plugin and select ‘Scrape this website’.

  3. Create a property and give it a name.

  4. Hover over the property you want to scrape into Google Sheets.

  5. Click into the property, review previewed results, then click the check mark.

  6. Once you’re done scraping properties, click ‘View results’.

  7. After being redirected to the results page, click ‘Save recipe’.

  8. Give your recipe a name, make sure all of the information is correct, then click into ‘Show advanced options’.

  9. Schedule the scraper to run once a day.

  10. Keep the rest of the settings untouched.

  11. Create the recipe, then click into your new recipe shown on the left-hand side of your screen.

  12. Click into ‘Integrate’ then toggle on Google Sheets.

Now you’ve built a basic scraper, and every day the property data from the website you chose in step 1 will pull into the Google Sheet you created in step 12.
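If you’re curious what that daily run is doing for you under the hood, here’s a rough Python equivalent. The URL and selectors are placeholders (they match the made-up markup from step 2), and a local CSV stands in for the Google Sheet that Simplescraper writes to:

```python
# A rough equivalent of the daily recipe run: fetch the page, pull each
# property out of the HTML, and append the results as rows. The URL and
# selectors are placeholders for your real site.
import csv
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

page = requests.get("https://example.com/startups", timeout=30)
soup = BeautifulSoup(page.text, "html.parser")

rows = []
for card in soup.select("div.startup"):  # one element per listing
    rows.append({
        "title": card.h2.get_text(strip=True),
        "description": card.find("p", class_="pitch").get_text(strip=True),
        "url": card.a["href"],
    })

# Simplescraper pushes to Google Sheets; a local CSV is the same idea.
with open("scrape_results.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "description", "url"])
    if f.tell() == 0:  # write the header only on the first run
        writer.writeheader()
    writer.writerows(rows)
```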

Step 4: transform your data

You’ve done the hard work of setting up a system to consistently pull data. Now you want an easy way to make that data presentable.

We recommend doing this by transferring your Google Sheet data over to Airtable, cleaning it up with add-ons (optional), and then pulling that cleaned data into Softr.

Here’s how we recommend doing that.

Google Sheets âžĄïž Airtable

What you need for this step:

  • An Airtable base with the columns mapped to match your Google Sheet

  • A Zapier subscription (if you aren’t already using Zapier, hopefully this changes that)

If you’re familiar with Zapier and no-code automation, this step is straightforward. If you aren’t familiar with Zapier, they’ll help make this easy once you log into your new account.

The logic is that whenever a new spreadsheet row is created in the Google Sheet for your scraper output, that record will be automatically synced to your Airtable base. When you log into Zapier, you’ll need to create a new Zap. Here’s what you input when instructed to do that:

The trigger: New spreadsheet row created in Google Sheets

The action: Create new record in Airtable

Alternatively, you can just copy the Zap we’ve already made (linked HERE).
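If you’d rather see the moving parts, here’s roughly what that Zap does each time a row lands in your sheet, sketched against Airtable’s REST API. The token, base ID, table name, field names, and row data are all placeholders:

```python
# Roughly what the Zap does for each new spreadsheet row: create a
# matching record via Airtable's REST API. All values are placeholders.
import requests

AIRTABLE_TOKEN = "YOUR_AIRTABLE_API_KEY"
BASE_ID = "appXXXXXXXXXXXXXX"
TABLE = "Deals"

new_row = {
    "title": "Acme AI",
    "description": "Autonomous spreadsheets for indie investors.",
    "url": "https://example.com",
}

resp = requests.post(
    f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}",
    headers={"Authorization": f"Bearer {AIRTABLE_TOKEN}"},
    json={"records": [{"fields": {
        "Title": new_row["title"],
        "Description": new_row["description"],
        "URL": new_row["url"],
    }}]},
    timeout=30,
)
resp.raise_for_status()
print("Created record:", resp.json()["records"][0]["id"])
```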

Adding images (optional)

Some sites make it hard to scrape image data, so this is a workaround if presentability of your data matters to you. Skip to the next section if that’s not you.

What you need for this step:

  • A subscription to Urlbox

  • A new column in your Airtable base to show images, set up as an ‘Attachment’ field

The trigger: New record is created in Airtable

Action #1: Generate a screenshot URL from Urlbox

  • Output file type: PNG

  • Viewport Width: 320

  • Viewport Height: 600

  • Hide cookie banners: true

  • Retina: false

Action #2: Update record in Airtable

  • Update the empty product image field to include the screenshot URL created by Urlbox

Alternatively, you can just copy the Zap we’ve already made (linked HERE).
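Here’s a sketch of what those two actions amount to, using the settings above. The keys, IDs, and field names are placeholders, and you should confirm the Urlbox query parameter names against their docs before relying on this:

```python
# A sketch of the two actions above: build a Urlbox screenshot URL with
# the listed settings, then write it into the record's attachment field.
# Airtable downloads the URL itself and stores a copy.
from urllib.parse import urlencode
import requests

URLBOX_KEY = "YOUR_URLBOX_PUBLISHABLE_KEY"
AIRTABLE_TOKEN = "YOUR_AIRTABLE_API_KEY"
BASE_ID = "appXXXXXXXXXXXXXX"
TABLE = "Deals"
RECORD_ID = "recXXXXXXXXXXXXXX"

# Action #1: generate the screenshot URL (Urlbox renders on request).
params = urlencode({
    "url": "https://example.com",
    "width": 320,             # viewport width
    "height": 600,            # viewport height
    "hide_cookie_banners": "true",
    "retina": "false",
})
screenshot_url = f"https://api.urlbox.io/v1/{URLBOX_KEY}/png?{params}"

# Action #2: update the record's attachment field with the new URL.
resp = requests.patch(
    f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}/{RECORD_ID}",
    headers={"Authorization": f"Bearer {AIRTABLE_TOKEN}"},
    json={"fields": {"Product Image": [{"url": screenshot_url}]}},
    timeout=30,
)
resp.raise_for_status()
print("Updated record:", resp.json()["id"])
```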

Step 5: putting a front end on your cleaned data

What you need for this step:

  • A Softr account (they offer free trials)

If you’re still following along, you’ve found a website to scrape, you’ve developed a system to consistently scrape website data into a Google Sheet, and you’ve transformed that data into something presentable by adding images if they weren’t otherwise available.

Now comes the fun part to make all of that hard work look like a real application.

We recommend using Softr for this step. We started using their software two months ago, and it’s given us superpowers.

Finding a template

After you create an account (they offer free trials), you’ll need to select an existing template to get started.

This is a matter of personal preference, but if you plan on building something similar to this, you’ll want something with lists built in. Most of these are in the ‘Resource Directories’ section of the templates.

Connecting Softr to Airtable

When you click into your new app, it will be filled with dummy data.

To change this, you’ll need to click into the list section within Softr, click into ‘Data’ on the right side of the screen, and connect to the Airtable base you built in the previous step.

If you’re using Softr for the first time, it will ask for your Airtable API key to authenticate you. If you’re struggling to get that set up, Softr has made a walkthrough explainer, and I’ve linked it here.
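Before pasting the key into Softr, you can sanity-check it with a single API call. This is just an assumed quick test (not something Softr requires); the base ID and table name are placeholders:

```python
# An assumed quick test: if this request succeeds, the same key and
# base will work when you connect Softr. Values are placeholders.
import requests

AIRTABLE_TOKEN = "YOUR_AIRTABLE_API_KEY"
BASE_ID = "appXXXXXXXXXXXXXX"
TABLE = "Deals"

resp = requests.get(
    f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}",
    headers={"Authorization": f"Bearer {AIRTABLE_TOKEN}"},
    params={"maxRecords": 1},
    timeout=30,
)
resp.raise_for_status()  # a 401/403 here means the key or base is wrong
print("Key works; sample record:", resp.json()["records"][:1])
```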

Mapping your data in Softr

Almost done. Now you have the right data in Softr, and you just need to map it correctly.

In the list section of Softr, go into ‘Features’ and, for each of the item’s fields, map it to the correct data, rename it so it has the correct label, and remove any unnecessary items.

If you have a button or something similar and want to redirect people to a URL when they click into that button, you have two options:

  1. You can reroute to whatever URL you’re capturing in your database, if you’re scraping it as a property. This is the easy option.

  2. You can build a separate page in Softr that lets you click into any company and see more information about it. This is the harder option, but it’s worth doing if you want your users to stick around longer (versus exiting to the company URL).

That’s it! You’ve now taken a project from 0 to 1, and you have a tool you can use to automate the collection of whatever information you want.

If you need help automating your systems, I can help. Just respond to this email, and we can figure something out.

📌 Featured jobs

Jasper is looking for a remote revenue sales analyst.

Surfer is looking for a remote product designer.

Odin is looking for a growth and partnerships lead in London.

Sydecar is looking for a remote marketing manager.

Links we like

This week's episode is brought to you by:

Softr: The easiest way to build professional web apps on top of Airtable

If you're using Airtable to store data, you HAVE to layer Softr on top.

Their software lets you turn your ugly databases into beautiful web apps. We've used Softr to build our investor directory, public roadmap, and the Signal Tracker this newsletter just walked you through building.
