Right, I'll try to continue that later. However, you have to assume that there is a "right sequence" to understanding this sort of thing, and that that sequence has been arrived at largely by trial and error over an extended period.
Noting that you're digging into books etc., I'll try to make a few more general points and then stand back for a while.
Computer programming is not maths, in the same way that maths is not arithmetic. Also it's not logic, in the sense that logic is understood by lawyers etc.
Neither is it an art: assumptions that "following one's nose" or "gut feeling" will result in reliable code- code that both works for five minutes and keeps on working if asked to run for hours or months- are almost always unsafe. Hence, in part, the grief I gave you earlier over your "it ought to..." position :-)
Neither is it pure engineering: there are few if any branches of engineering which have not been subjected to robust mathematical analysis and conceptual proof, but software generally stands outside that because of the extreme difficulty of quantifying requirements and behaviour, and the extent to which the complexity of modern components stands outside formal proof (the gates in a CPU are a whisker away from being perturbed by quantum effects).
Pascal, as a language, started off as a one-man project by a compiler writer angered to the point of derangement. By the 1990s the major implementations had matured to be at least as good as other general-purpose languages of the day, but since then it has suffered from an explosion of both good ideas and- to be frank- computer science crap, which has left it with (a) some of the more difficult-to-use concepts sidelined and (b) multiple ways to do quite simple jobs which have been implemented with no regard for efficiency.
Looking at point (a) first, if you refer to older books on Pascal they will show you how to allocate blocks of memory from what is known as the heap, refer to them via pointers, and free them when no longer needed. To a very large extent that has been replaced by the creation of instances of classes which can be referred to like any other variable; you do still need to free them when no longer needed, but this is much less painful than the original. (The water is muddied by the fact that there are also objects and advanced records with much the same functionality.)
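By way of illustration- and this is only a sketch, with TStringList picked simply because it's a convenient standard class from the Classes unit- the two styles side by side:

```pascal
program HeapDemo;
{$mode objfpc}
uses Classes;

type
  PRec = ^TRec;
  TRec = record
    Value: Integer;
  end;

var
  R: PRec;
  L: TStringList;
begin
  { Old style: explicit heap allocation, accessed via a pointer }
  New(R);
  R^.Value := 42;
  WriteLn(R^.Value);
  Dispose(R);

  { Newer style: a class instance, referred to like any other variable.
    It still has to be freed, but try..finally makes that far less painful. }
  L := TStringList.Create;
  try
    L.Add('hello');
    WriteLn(L[0]);
  finally
    L.Free;
  end;
end.
```

The try..finally pattern is worth adopting early: the Free happens even if an exception is raised in between.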
In addition, relatively recent compilers have relaxed some of the ordering requirements: while it is still necessary to declare anything before it is used, there is no longer a requirement that all constants be declared before all types before all variables: by and large that's to be applauded. Don't, however, assume that you can declare a variable inside code (as you can with C/C++ etc.): all declarations have to be /before/ the code block that will use them.
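A minimal sketch of what the relaxed ordering does and doesn't allow (names here are arbitrary):

```pascal
program OrderDemo;
{$mode objfpc}

const
  Max = 10;
type
  TCount = 0..Max;
var
  I: TCount;
const
  Greeting = 'hello';  { a second const section after var: fine in modern FPC }

begin
  { But a C-style declaration here, e.g. "var J: Integer;" in mid-code,
    is an error: all declarations must come before the begin. }
  for I := 0 to Max do
    Write(I, ' ');
  WriteLn(Greeting);
end.
```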
Looking at point (b), Pascal started off with strings of up to 255 characters where each character was a single byte; a string of up to n characters was declared like
var something: string[n];. The water has been muddied enormously by the introduction of strings declared like
var something: string;, which can not only store more than 255 characters, but in which each character might occupy two or more bytes in order to represent characters outside the original 7-bit US-ASCII range. The result of this is that any attempt to step through such a string one byte at a time will not behave how you want, which is one of the reasons why I have misgivings about your use of a Unicode private area.
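To make the byte-vs-character distinction concrete, here's a sketch using UTF8Length from the LazUTF8 unit (which ships with Lazarus in the LazUtils package); the é is spelled out as its two UTF-8 bytes so the result doesn't depend on the source file's encoding:

```pascal
program StrDemo;
{$mode objfpc}{$H+}
uses LazUTF8;  { Lazarus unit providing UTF-8 aware helpers }

var
  Old: string[5];  { classic ShortString: at most 5 single-byte characters }
  S: string;       { modern string: length limited only by memory }
begin
  Old := 'hello';
  WriteLn(Old);
  S := 'h' + #$C3#$A9 + 'llo';  { "héllo": é is two bytes in UTF-8 }
  WriteLn(Length(S));      { counts BYTES: 6 }
  WriteLn(UTF8Length(S));  { counts characters: 5 }
end.
```

So a loop over S[1]..S[Length(S)] would visit the two halves of the é separately, which is exactly the trap mentioned above.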
There are of course ways round this: several different fundamental character and string types, directives etc. to specify the current codepage when this needs to be known, and a multiplicity of library functions to protect the user from himself.
On a related point, there are now dynamic arrays which can contain an arbitrary number of elements... plus open arrays, plus ways of retrieving some of the information that the compiler and runtimes use to describe them. They're undoubtedly extremely useful but there are pitfalls... hence in part some of the comments made about certain types of storage being "good up to a 1000 elements or so" earlier.
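A sketch of the basics (Sum is just a made-up example function): SetLength sizes a dynamic array at runtime, an open-array parameter accepts arrays of any size, and Low/High retrieve the bounds information the compiler keeps for you. One of the pitfalls alluded to: growing an array one element at a time with repeated SetLength calls can mean repeated reallocation and copying, which is part of why naive use stops scaling past a certain size.

```pascal
program ArrDemo;
{$mode objfpc}

{ An open-array parameter: accepts a static or dynamic array of Integer }
function Sum(const A: array of Integer): Integer;
var
  I: Integer;
begin
  Result := 0;
  for I := Low(A) to High(A) do  { bounds come from the compiler's metadata }
    Result := Result + A[I];
end;

var
  D: array of Integer;  { dynamic array: size fixed only at runtime }
  I: Integer;
begin
  SetLength(D, 4);
  for I := 0 to High(D) do
    D[I] := I + 1;       { 1, 2, 3, 4 }
  WriteLn(Sum(D));
  WriteLn(Length(D));
end.
```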
Now, depending on exactly what books etc. you have available to you, you might or might not find the above points mentioned. Older books- including the foundation Jensen & Wirth which you will at least see referenced- will emphasise manual allocation and deallocation of memory and manual creation of data structures (lists, trees and so on): these are things that you need to know exist, but not ones that you should assume you will be using routinely. Newer books might go overboard about doing absolutely everything by creating small objects and passing them around to mimic the "functional" style of programming: that's all very well, until you start looking at the resultant overheads which might result in something far less efficient than "classical" code. And books focussing on Delphi might give you information on strings which is misleading in the context of Lazarus/FPC.
If in doubt, ask: the community is here to help. But it doesn't like being hectored, and told that something /must/ work and /must/ be the appropriate way when that is quite simply not the case :-)
Hoping that my notes are, to some small extent, useful. If nothing else, print them and use them as bookmarks :-)
MarkMLl