- fixed Netlify deployment
- hooked up Netlify build with API (this was insanely easy)
- created event-detail-page
- create a test inserting data using strapi API
- create scraper.js with cheerio
- create a scraper user and authenticate in scraper.js
- build in a webhook to Netlify after a successful scrape
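The last three items on that list could end up in one script. Here's a rough sketch of what scraper.js might look like; the URLs, CSS selectors, env var names and the build-hook ID are all placeholders, not the real ones (Strapi's `/auth/local` route and Netlify's build hooks do work this way, though):

```javascript
// scraper.js: a rough sketch (URLs, selectors and env vars are placeholders).
// Assumes Node 18+ for the global fetch; cheerio is required lazily so the
// pure helper below stays usable without it installed.

// Normalize one scraped event before sending it to Strapi.
function cleanEvent(raw) {
  return {
    title: (raw.title || '').trim(),
    date: (raw.date || '').trim(),
  };
}

// Parse events out of a listing page's HTML with cheerio.
function parseEvents(html) {
  const cheerio = require('cheerio');
  const $ = cheerio.load(html);
  return $('.event')
    .map((_, el) =>
      cleanEvent({
        title: $(el).find('.title').text(),
        date: $(el).find('.date').text(),
      })
    )
    .get();
}

async function run() {
  // 1. Authenticate as the dedicated scraper user (Strapi's local auth route).
  const { jwt } = await fetch('https://my-strapi.example.com/auth/local', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      identifier: process.env.SCRAPER_USER,
      password: process.env.SCRAPER_PASS,
    }),
  }).then((res) => res.json());

  // 2. Scrape the source site and insert each event via the Strapi REST API.
  const html = await fetch('https://some-event-site.example.com').then((res) => res.text());
  for (const event of parseEvents(html)) {
    await fetch('https://my-strapi.example.com/events', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${jwt}` },
      body: JSON.stringify(event),
    });
  }

  // 3. Ping the Netlify build hook so the site rebuilds with the fresh data.
  await fetch('https://api.netlify.com/build_hooks/<hook-id>', { method: 'POST' });
}
```

Step 3 is the webhook part: creating a build hook in the Netlify UI gives you a unique URL, and any POST to it triggers a deploy.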
Let's see if I can get this to work today. That would be awesome, and I could then focus on the front-end. I'm afraid, though, that I'll run into some technical difficulties, as programming is always at least 200% harder than I think it will be (which is always reflected in my estimates for clients too).
Did some behind-the-scenes reading on Gatsby (literally):
I tried to sum up what I read and Jason Lengstorf was nice enough to confirm my little summary:
Gatsby is based on 'nodes'; the data is basically modelled as JSON (or actually is JSON).
Source plugins gather data from different sources and add 'nodes'. During the build process, Gatsby collects these nodes (which can come from fetched APIs, markdown files or a database), and I guess the result is one (huge) JSON data model.
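To make that concrete, here's a minimal sketch of what a source plugin does in its `sourceNodes` hook. The `Event` type and the sample data are made up for illustration; Gatsby passes in the `actions`, `createNodeId` and `createContentDigest` helpers during the build, and every `createNode` call adds one plain JSON-like object to the data layer:

```javascript
// In a real plugin this function lives in gatsby-node.js and is exported
// as `exports.sourceNodes`. Gatsby calls it during the build.
async function sourceNodes({ actions, createNodeId, createContentDigest }) {
  // Pretend this came from a fetched API, markdown or a database.
  const events = [
    { title: 'JS Meetup', date: '2019-06-01' },
    { title: 'Gatsby Workshop', date: '2019-06-15' },
  ];

  for (const event of events) {
    // A node is just a plain object plus some required metadata.
    actions.createNode({
      ...event,
      id: createNodeId(`event-${event.title}`),
      internal: {
        type: 'Event', // surfaces as `event` / `allEvent` in the GraphQL schema
        contentDigest: createContentDigest(event),
      },
    });
  }
}
```

So the "huge JSON data model" is literally a big collection of objects shaped like the ones created above.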
In components, we can use the GraphQL schema to query this huge data model.
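For instance, the event-detail page could pull its data with a page query along these lines (the type and field names are guesses, not the real schema):

```graphql
# Hypothetical page query for the event-detail page.
# Gatsby runs this at build time against the node data layer
# and hands the result to the component as props.data.
query EventDetail($slug: String!) {
  event(slug: { eq: $slug }) {
    title
    date
    description
  }
}
```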
-- insert screaming 'YES' gif here -- (new feature request @basilesamel ? 😂)