Today, in July of 2019, many people want to write web apps. The variables to consider are the output we want on the screen and the data that makes that output dynamic, combined with the calculations we make to add value to raw data. We might sort and filter that data so users can easily slice and dice it to gain better understanding and insight.
Sometimes the output is dynamic, like when customers are surveyed and provide feedback. Sometimes the input data doesn't change as often. Our job as developers is to figure out what makes an interaction meaningful. Because so many people have been working on these same problems, patterns emerge.
When one pie is prepared in a kitchen, the important moves aren't obvious. But if you make 1,000 pies, you'll understand how to better design a working kitchen to manufacture pies.
The patterns in web and mobile development have emerged across millions of sites. As languages are employed to produce interactive results, problems arise. Smart people see those problems as opportunities and create solutions that make our lives easier. These solutions result in tooling, deeper thinking, and ultimately, progress.
To understand code, we must learn patterns and APIs and reason about them in our own minds to grasp the logic. In spoken language, object-to-label mapping is easy. I point to a cup and say 'cup'. You start to understand. I say, 'Pour the water into the cup.' And since you already know 'water', 'pour', and 'into', you can easily piece together how the cup fits in. Use it a couple of times and the pattern will take up residence in your long-term memory.
Unless you write code frequently, these concepts and man-made labels and patterns don't move in quickly. The longer you program, the less hesitant you are to let them in, simply because you know they'll be leaving soon in favor of new developments. This leads to the popularity of using Google to code. Before you learn to Google, however, you must really understand the fundamentals.
In any code, there are fundamentals.
Computers are enabled by electricity, memory, computation units, and built-in control software.
To control the computer, we use software to read from and write to memory. As we move values in and out of memory locations, we create logical calculations. These calculations include basic math operations and simple decisions based on true/false values. On top of these basics, there are failures we need to account for. We may write code that waits for a response from a computer in another part of the world; while we're waiting, an earthquake might disrupt the communication towers that carry the signal. Unless we account for such possibilities, our code might stop working. We call these errors. The waiting and watching is enabled by run-time loops that keep the computer ready for any input.
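The waiting, watching, and error handling described above can be sketched in a few lines of Python. This is a toy simulation, not real networking: the `fetch` function and its canned responses are invented for illustration, with `None` standing in for a disrupted link.

```python
import time

def fetch(responses):
    """Hypothetical remote call, simulated with canned responses.

    None stands for a disrupted link; anything else is a reply."""
    result = responses.pop(0)
    if result is None:
        raise ConnectionError("link down")
    return result

def fetch_with_retry(responses, attempts=3, delay=0.01):
    """Wait-and-watch loop: keep trying until we get a value
    or run out of attempts."""
    for attempt in range(attempts):
        try:
            return fetch(responses)    # read from "the network"
        except ConnectionError:
            if attempt == attempts - 1:
                raise                  # give up: surface the error
            time.sleep(delay)          # wait, then watch again

# First two calls fail (disrupted link), the third succeeds.
print(fetch_with_retry([None, None, "pong"]))  # -> pong
```

The loop is the part to notice: instead of assuming the first answer arrives, the code stays ready, waits between failures, and only raises an error once the possibilities are exhausted.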
From this, we can understand that any line of code asks only simple questions that you, as the programmer, can answer:
What's the current state of memory?
What values are we working with?
Are we reading or writing?
Are we making decisions?
Are there any errors?
Are we waiting and watching for input?
State : Values : ReadWrites : Decisions : Errors : WaitWatch
These are the elements we have to understand deeply.
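As a minimal sketch, most of these elements fit in a handful of Python lines. The dictionary and the names in it are invented for illustration; the point is only to label which question each line answers.

```python
# State: memory the program reads and writes.
scores = {"alice": 90}             # write: store a value in memory
total = scores["alice"]            # read: fetch it back

# Values and decisions: branch on a true/false test.
passed = total >= 60               # a decision built from values
label = "pass" if passed else "fail"

# Errors: account for a read that can't succeed.
try:
    bob = scores["bob"]            # no such key in memory
except KeyError:
    bob = None                     # recover instead of crashing

print(label, bob)  # -> pass None
```

Every line above is doing one of the simple things in the list: holding state, reading or writing a value, deciding, or recovering from an error.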
I'm writing to learn, not to teach. How could I learn more about the topic I've written on?