Friday Finking: Beyond implementing Unicode
Unicode has given us access to a wealth of mathematical and other
symbols. Hardware and soft-/firm-ware flexibility enable us to move
beyond and develop new 'standards'. Do we have opportunities to make
computer programming more math-familiar and/or more
logically-expressive, and thus easier to learn and practise? Could we
develop Python to take advantage of these opportunities?
TL;DR? Skip to the last paragraphs/block...
Back in the 'good, old days', small eight-bit computers advanced beyond
many of their predecessors because we could begin to encode characters
and "string" them together - as well as compute with numbers.
Initially, we used the 7-bit ASCII code (on smaller machines - whereas
IBM mainframes used EBCDIC, etc). ASCII gave us both upper- and
lower-case letters, digits, special characters, and control codes.
Later this was extended to 8 bits as "Code Page 1252", whereby MSFT
added more special characters, superscripts, fractions, currency
symbols, and many of the plain and accented letters used in other
(Western European) languages. Latterly, we have implemented Unicode,
which seeks to include all of the world's scripts and languages, and
may employ multiple bytes per character.
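To see the progression in action, here is a quick illustrative REPL
session (the exact error text varies between Python versions):
>>> "π".encode( "ascii" )        # 7-bit ASCII has no Greek letters
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
UnicodeEncodeError: 'ascii' codec can't encode character '\u03c0' in position 0: ordinal not in range(128)
>>> "Straße".encode( "cp1252" )  # one byte per character: ß is 0xdf
b'Stra\xdfe'
>>> "Straße".encode( "utf-8" )   # Unicode: the same ß needs two bytes
b'Stra\xc3\x9fe'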
A massive effort went into Python (well done, PyDevs!), and the
adoption of Unicode, in particular, made Python 3 a less-than-seamless
upgrade from Python 2. However, 'standing upon the shoulders of
giants', we can now take advantage of Unicode both as an encoding for
data files and within the code of our own Python applications. We don't
often see examples of the latter, eg
>>> π = 3.14159
>>> r = 1
>>> circumference = 2 * π * r
>>> print( circumference )
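6.28318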
>>> Empfänger = "dn" # Addressee/recipient
>>> Straßenname = "Lansstraße" # Street name
>>> Immobilien_Hausnummer = "42" # Building/house number
(whilst the above is valid German, I have 'cheated' in order to add
suitable characters - for the purposes of illustration to
EN-monolinguals - apologies for any upset to your sense of "Ordnung"
(order) - please consider the meaning of "42" to restore yourself...)
However, we are still shackled to a history where an asterisk (*) is
used as the multiplication symbol, because "x" was already an ASCII
letter and "×" was not in the character set. Similarly, we have ** as
the exponentiation operator, because we didn't have the superscripts of
algebraic notation. Worse, we made "=" mean: 'use the identifier on the
left to represent the right-hand-side value', ie "Let" or "Set" - this
despite left-to-right reading making it more natural to say: 'transfer
this (left-side) value to the part on the right', ie 'give all of the
chocolate cake to me' - as well as 'robbing' us of the symbol's usual
meaning of "equality" (which in Python had to become the "==" operator).
Don't let me get started on "!" (exclamation/surprise!) meaning "not"!
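Today's Python neatly illustrates the split: identifiers may use (many)
Unicode characters, yet the operators remain fixed ASCII. Something
like this (the exact SyntaxError wording varies by version):
>>> π = 3.14159     # Greek letters are legal in identifiers
>>> 2 * π           # ...but multiplication must be the asterisk
6.28318
>>> 2 × π           # the 'real' multiplication sign is rejected
  File "<stdin>", line 1
    2 × π
      ^
SyntaxError: invalid character '×' (U+00D7)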
There is/was a language called "APL" (and yes, the acronym means "A
Programming Language", and yes, it arguably started the single-letter
craze - BCPL was cut down to "B", and yes, that brought us "C" - which
you are more likely to have heard about - and yes, then there were
DataSci folk, presumably more numerate than literate, who thought the
next letter to be "R". So, sad!?). The point of mentioning APL? It
allowed the likes of: (+/V)÷⍴V - a classic one-liner which averages the
vector V, using single symbols for sum-reduction (+/), length (⍴), and
division (÷).
APL was hopelessly keyboard-unfriendly, requiring multiple key-presses
or 'over-typing' to produce those arithmetic-operator symbols -
remember, much of this was on mainframe typewriter-style terminals, eg
the IBM 2741 with its interchangeable APL 'golf-ball', although later
PC-implementations have existed (I can't comment on how 'active' any
community might be). The over-typing (backspace, then strike a second
character) was necessary to produce the APL symbols which don't exist
on a standard typewriter keyboard. Ugh!
I'm glad to have limited my APL-exposure to only reading about it
during a 'Programming Languages' topic! (If you are 'into' functional
programming, you may like to explore further.)
Turning now to "hardware" and the subtle 'limitations' it imposes upon us.
PC-users (see also Apple, and glass-keyboard users) have become wedded
to the 'standard' 101~105-key "QWERTY"/"AZERTY"/etc keyboards (again,
restricting myself to European languages - with due apologies). Yet
there exists a variety of ways to implement the 'standard', as well as
a range of other keyboard layouts. Plus we have folk experimenting with
SBCs, eg Raspberry Pi: learning how to interpret low-level hardware, ie
key-presses and keyboard "matrices", and developing/innovating all
manner of special interfaces and other tools.
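By way of illustration, here is a minimal sketch of that kind of
experiment: scanning a hypothetical 4x4 keypad wired to a Raspberry Pi,
and emitting math-friendly symbols. The pin numbers, wiring, symbol
layout, and polling loop are assumptions of mine, not a tested design:

# scan a (hypothetical) 4x4 keypad on Raspberry Pi GPIO, mapping
# each button to a mathematical/Unicode symbol
import time
import RPi.GPIO as GPIO

ROW_PINS = (5, 6, 13, 19)    # assumed BCM numbering - match your wiring
COL_PINS = (12, 16, 20, 21)
SYMBOLS = (                  # an invented layout: operators, Greek, arrows
    ("×", "÷", "√", "π"),
    ("≤", "≥", "≠", "≈"),
    ("←", "→", "↑", "↓"),
    ("²", "³", "∑", "∈"),
)

GPIO.setmode(GPIO.BCM)
for row in ROW_PINS:         # rows are driven high, one at a time
    GPIO.setup(row, GPIO.OUT, initial=GPIO.LOW)
for col in COL_PINS:         # columns are read, pulled low when idle
    GPIO.setup(col, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

def scan():
    """Return the symbol of the first pressed key found, else None."""
    for r, row in enumerate(ROW_PINS):
        GPIO.output(row, GPIO.HIGH)
        try:
            for c, col in enumerate(COL_PINS):
                if GPIO.input(col):    # circuit closed: key (r, c) is down
                    return SYMBOLS[r][c]
        finally:
            GPIO.output(row, GPIO.LOW)
    return None

try:
    while True:
        key = scan()
        if key:
            print(key, end="", flush=True)
            time.sleep(0.3)            # crude debounce
        time.sleep(0.01)
finally:
    GPIO.cleanup()

(A real gadget would go on to inject such key-codes into the operating
system, eg via uinput - another rabbit-hole entirely.)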
Back to Python, or is it 'forward to Python':
In the same way that we often add a "numeric key-pad" to the standard
'typing' alphanumeric keyboard, could we add another key-pad or row of
keys? Might that give us the opportunity to start employing other, more
logical, symbols for programming, eg directional arrows, a modifier key
to enable subscripts and another for superscripts, and Greek/Math
symbols to release us from approximations/alternatives such as
asterisk-means-multiply, and so on...?
Could we then also 'update' Python, to accept the wider range of
symbols instead of, or in addition to, those currently in use?
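One naive way to experiment today, without touching the interpreter
itself, is a pre-processing step which rewrites such symbols into
current Python before execution. A toy sketch - the symbol table and
the translate() helper are simplifications of my own invention:

# a toy 'symbol translator': rewrite math-style source into today's Python
import math

SYMBOL_MAP = {      # an invented mapping - argue over the entries at will
    "←": "=",       # assignment arrow (as in APL)
    "×": "*",       # true multiplication sign
    "÷": "/",       # division sign
    "≠": "!=",      # inequality
    "≤": "<=",
    "≥": ">=",
}

def translate(source: str) -> str:
    """Naively replace each math symbol with its ASCII spelling."""
    for symbol, ascii_spelling in SYMBOL_MAP.items():
        source = source.replace(symbol, ascii_spelling)
    return source

π = math.pi         # Greek identifiers already work - no translation needed
r = 1
program = "circumference ← 2 × π × r"
exec(translate(program))    # runs: circumference = 2 * π * r
print(circumference)        # 6.283185307179586

(Blind string-replacement would, of course, also rewrite symbols inside
string literals - real tooling would work at the tokeniser level.)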
Would such even constitute 'a good idea'?