WalterBright 2 days ago

Back in 1978, I made my own keyboard for a single board 6800 computer I designed, also because I could not afford a keyboard.

I went to a surplus store and bought an EBCDIC keyboard for a couple bucks. I unsoldered all the keys from its circuit board. I took a plastic board and, using the old circuit board as a template, drilled holes in it, inserted the keys in the holes, and then wired them up in an 8x8 grid pattern. The two sets of 8 bits gave 64 possible keys, which was enough, and connecting them to an I/O port made it possible to recognize which key was down.
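
In rough C terms, scanning a matrix like that looks something like the sketch below. The port addresses are invented and the real thing was hand-written 6800 assembly, so treat it as an illustration of the idea rather than the actual routine:

    #include <stdint.h>

    /* Hypothetical memory-mapped I/O addresses -- placeholders, not the
       real board's memory map. Row lines are outputs driven active-low,
       column lines are inputs with pull-ups. */
    #define ROW_PORT (*(volatile uint8_t *)0x8000)
    #define COL_PORT (*(volatile uint8_t *)0x8001)

    /* Scan the 8x8 matrix: pull one row low at a time and look for a
       column that follows it down. Returns 0..63 for the first key
       found, or -1 if nothing is pressed. There is no rollover: a
       second key held at the same time can confuse the result. */
    int scan_keyboard(void)
    {
        for (int row = 0; row < 8; row++) {
            ROW_PORT = (uint8_t)~(1u << row);   /* select one row          */
            uint8_t cols = (uint8_t)~COL_PORT;  /* invert active-low input */
            for (int col = 0; col < 8; col++) {
                if (cols & (1u << col))
                    return row * 8 + col;
            }
        }
        return -1;
    }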

It worked fine as long as you were careful not to press more than one key at a time.

I don't recall what I did with that computer. It's all gone, including the design notebook for it.

  • ChuckMcM a day ago

    Ah, the memories, eh? I had a Digital Group Z80 system with a keyboard that encoded the keypress as an 8-bit value that could be read by the computer. It was 6 bits of keypress, one bit for shift, and one bit for control. I actually know where it is, though: it's in a computer museum in Germany (long story).

    • jbeninger a day ago

      I read a tongue-in-cheek short story about an elf walking through a human museum.

      "We think this urn was once used in agricultural ceremonies"

      "Bitch, that's my coffee pot!"

      Reading comments from older devs sometimes gives me the same vibes.

      • ChuckMcM 21 hours ago

        My wife was looking at the mini REI museum of camping gear, and her backpacking stove (a SVEA 123), which she still uses, was identical to one of the artifacts on display.

      • betterThanTexas a day ago

        Making coffee is an agricultural ceremony!

        • cryptonector a day ago

          That keyboard? I used it to kick off my coffee machine every morning.

  • YZF a day ago

    I wanted to make my own joystick for a ZX-81 but at the time didn't know enough to decipher the expansion port signals. The address/data lines didn't make any sense to me. At least I didn't fry it.

  • Dylan16807 2 days ago

    Could the code subtract out the previous key for two overlapping key presses, or was it a very strict one at a time?

    • throwanem 2 days ago

      If I had to guess, the sixty or so diodes required for an NKRO matrix might have blown the money budget for the project, much as trying to do that kind of work in an interrupt handler would have blown the cycle budget for any interesting program, even had it been possible. Typing more slowly is free.

      Hard to say. I would need to look up component prices for a few years prior to my birth, and it would take me a few minutes to find that archive of 70s Radio Shack catalogs again - though I believe it was actually posted here, so. Of course, anyone serious enough to be building an entire computer in those days, brilliant keyboard hack and all, probably wouldn't be sourcing a jellybean part like 1N4148 signal diodes one by each...

      • WalterBright 2 days ago

        The keys I used weren't jellybean. They were excellent keys. They were in the surplus bin because EBCDIC keyboards were already obsolete by then.

        I had very little money at the time, and scrounged for parts.

        • throwanem 2 days ago

          By "jellybean" I meant the sort of small signal diodes one might use to prevent aliasing in a keyboard matrix. I know where I'd have gone to get those in my twenties, but not really how I'd have afforded that many for one project, even at the (very) small discount they might have given me on an order of that size.

          Unfortunately, now that I've finally reached a point of being able to really effectively use such a resource as that store, in its place now stands a Sonic drive-in. So it goes.

    • WalterBright 2 days ago

      Many keyboards at the time could not do N-key rollover, including mine :-)

      Subtracting wouldn't work, as more than one combination of keys would produce the same 16 bits of signal.

      • Dylan16807 2 days ago

        No, no, I didn't say N, I said two.

        If you're holding one key, you have 1 bit set on each bank. If you press a second key, you now have an additional row bit and/or an additional column bit. You can tell what the new key is unambiguously. If it's still one bit on an axis then that bit is correct. If it's two bits then the new bit is correct.

        You only get problems when you have three keys held at the same time, or if multiple keys change state simultaneously.
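
        In C-ish terms, and assuming the two banks read back as one row byte and one column byte (which may or may not be how the hardware actually latched them), the decode is a small sketch like this:

            #include <stdint.h>

            /* Index of the single set bit in b, or -1 if b has zero bits
               or more than one bit set. */
            static int bit_index(uint8_t b)
            {
                for (int i = 0; i < 8; i++)
                    if (b == (1u << i))
                        return i;
                return -1;
            }

            /* One key was already held (old_rows/old_cols each have one bit
               set). A second key has just gone down. Work out which key it
               is, or return -1 if the change is ambiguous (three keys down,
               or several keys changing in the same scan). */
            int decode_second_key(uint8_t old_rows, uint8_t old_cols,
                                  uint8_t new_rows, uint8_t new_cols)
            {
                uint8_t added_rows = new_rows & ~old_rows;  /* freshly set bits */
                uint8_t added_cols = new_cols & ~old_cols;

                /* If an axis gained a bit, that bit belongs to the new key;
                   otherwise the new key shares that axis with the old one. */
                int row = bit_index(added_rows ? added_rows : new_rows);
                int col = bit_index(added_cols ? added_cols : new_cols);

                if (row < 0 || col < 0)
                    return -1;
                return row * 8 + col;
            }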

        • WalterBright 2 days ago

          Consider a grid:

              -+-+-- A
               | |
              -+-+-- B
               | |
               C D
          
          The keys AC and BD pressed simultaneously are indistinguishable from BC and AD.

          • Dylan16807 2 days ago

            Yes, I mentioned the timing.

            But allowing for one key and then a second key before you release the first one is a pretty big improvement for natural typing.

            • WalterBright 2 days ago

              Since it was polling the keyboard, among other tasks, the timing of when it looked was not entirely predictable. I was pushing the 6802 at its limit.

              At least the keys did not need debouncing, they were nice keys.

              The code listing has disappeared, too.

Syzygies 2 days ago

As a grad student I took out a student loan for the equivalent of a year's stipend, to buy an Apple II in 1980. Within a week I voided the warranty (?) by carrying out a "shift key modification" that involved cutting a trace on the motherboard.

It worked, so I'm a bit baffled by Woz's explanation.

The Apple II didn't really advance my math research (that would be the later 128K Macintosh, to which we ported the Macaulay computer algebra system), but various friends learned computers at my apartment, shaping their later careers. The Apple II remains my only computer whose memory layout one could understand byte for byte.

aantix 2 days ago

My first home computer was an Apple 2+. That was 1992 - so it was old and dated then, but my parents got it for $150, which was the right price range at the time for the family.

Luckily the guy selling included all sorts of manuals with it, which helped me to learn programming.

I remember a terminal program I used to connect to local BBSes - I would press the forward key, and that would invert the capital character's display on the screen (white background, black letter), denoting that a capital character was sent, not lower case.

kragen 2 days ago

> So my TV Terminal, for accessing the ARPAnet, was uppercase only.

I never realized that the most influential personal computer was specifically designed to access what would later become the internet. That's astounding.

(The first internetworking experiments that I can find records of were done as part of the ARPANet project, and some of the protocols and even many of the port numbers we use today in TCP/IP are from ARPANet.)

I also had no idea that Woz had built his own CPU out of discrete logic in 01970. I still haven't done that myself 55 years later!

  • bregma a day ago

    01970 is not a valid octal number. Perhaps you meant 0x7B2?

    • kragen a day ago

      It is in K&R.

  • adastra22 2 days ago

    I believe the internet is a direct continuation of ARPANet. It just got renamed at some point.

    • kragen 2 days ago

      The ARPAnet (the electronic system) was never an internet (or catenet). It predated internetworking, and it was one of the networks comprising the internet we are using now (specifically, it was 10.0.0.0/8) until its demise in 01990. As a social institution, though, you are correct: the internet we are using now started out as an experiment inside the ARPAnet project.

      • adastra22 14 hours ago

        I think you are making a distinction without a difference.

        • kragen 4 hours ago

          Then you are a laptop.

      • pbjtime 2 days ago

        ...what year are you from?

        • bytesandbots a day ago

          The long now foundation

          > established in 01996 to foster long-term thinking. Our work encourages imagination at the timescale of civilization

          • kragen a day ago

            Disclaimer: I am not affiliated with the Long Now Foundation, but I endorse the suggestion to read about them.

zabzonk 2 days ago

I had a Dragon 32 in the early 1980s. It had a (for the time) good keyboard, a terrible display, and was uppercase only. I bought it because I was interested in the 6809 processor (addressing modes gone mad!). But I think with some effort between Tandy and maybe Dragon (the Dragon was based on the Tandy Color Computer), things could have worked out better for both.

  • sixothree 2 days ago

    Really if it wasn’t for the ugly colors that computer might have had a chance.

  • bsder 2 days ago

    > (addressing modes gone mad!)

    With good reason. The 6809 was the 6800 with all the idiocies cleaned up. And I'm pretty sure that it is still the only 8-bit micro with genuinely relocatable/position independent code (aka doesn't need a linker pass!)

  • Mountain_Skies 2 days ago

    Late production runs of the Color Computer 2 had the capability of doing true lowercase. Not sure if Dragon was still making computers by then, but it would have been a simple part swap, as both machines' VDGs were from Motorola.

Asraelite a day ago

In the game LittleBigPlanet you can send digital signals between logic gates which are internally 32-bit floats. Because of the game's mechanics, the most data you can reasonably store in them is 24 bits using just the mantissa.

There's a small community, me and a few other people, that likes to build computers in the game, kind of like the redstone computers in Minecraft. The most efficient architecture is a 24-bit word size, so it makes sense to split that into 4 and use 6 bits per character. Because of that, some of the computers in the game use the Apple II charset.
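
The packing itself is trivial; here's a rough sketch in C just to show the bit layout (the in-game version is of course built out of the game's gates, not code):

    #include <stdint.h>

    /* Pack four 6-bit character codes into one 24-bit word. Any integer up
       to 2^24 - 1 survives a round trip through a 32-bit float's mantissa,
       which is why 24 bits is the practical limit per signal. */
    uint32_t pack4(uint8_t c0, uint8_t c1, uint8_t c2, uint8_t c3)
    {
        return ((uint32_t)(c0 & 0x3F))
             | ((uint32_t)(c1 & 0x3F) << 6)
             | ((uint32_t)(c2 & 0x3F) << 12)
             | ((uint32_t)(c3 & 0x3F) << 18);
    }

    /* Pull character 'slot' (0..3) back out of a packed word. */
    uint8_t unpack(uint32_t word, int slot)
    {
        return (word >> (6 * slot)) & 0x3F;
    }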

kazinator 2 days ago

I had an Apple ][+ clone which had no problem with lower case, both display and keyboard. It also featured a numeric keypad.

It was made in Hong Kong. The ROM announced itself as a "V.S.C. 1203".

kristianp 2 days ago

What's a good book to read about the early days of Apple? I enjoy these stories, but this one necessarily skips a lot of the story of the Apple I and II. A search brings up "The Little Kingdom" by Moritz, but it was published in 1984, so it may also cover the Mac, released in '84 [1].

[1] https://en.wikipedia.org/wiki/Macintosh_128K

  • codazoda 2 days ago

    I bet Woz’s latest book, which I think is iWoz, probably has some stories (though it probably covers a later time).

  • cancerhacker 2 days ago

    Woz gets some coverage in Steven Levy's "Hackers: Heroes of the Computer Revolution" from 1984, and he updated it in 2010. I try to reread it every year or so.

  • WoodenChair 2 days ago

    Yes, "The Little Kingdom" is a great book (the very slightly updated edition is called "Return to the Little Kingdom" but really just adds a short epilogue), and in my opinion (having read most of them), the best book on early Apple. It gives you more insight into some of the other characters at the early company (Mike Markkula and Mike Scott for example) than some of the other books. Interestingly, the author, Mike Moritz (a lot of Mikes), went on to be a highly successful venture capitalist. So he must've had some keen insights about the tech industry.

    I was a guest on a podcast called "CFO Bookshelf" to discuss that book if you want to hear a discussion of it before making the commitment to dive in:

    https://cfobookshelf.com/return-to-the-little-kingdom/

timonoko a day ago

Keyboard and Display considered superfluous.

When I was released from the army in 1977, I had learned how to do Morse code and how to maintain an AK-47, not much else.

Morse came in handy: it was quite possible to communicate with the computer with a single key and an LED.

__s a day ago

The inflation-adjusted price of a $60 computer comes to about $333, which matches up with the cost of custom ergo keyboards.

ourmandave 2 days ago

Because you were lucky to even get 40 columns (80 with an expansion card)?

  • SoftTalker 2 days ago

    80 columns would have been borderline unreadable on the NTSC televisions they were using as displays.

    • flomo a day ago

      Sure. As a practical matter, 90% of Apple II systems were plugged into a monochrome computer monitor which did 80 column text just fine, and the other ~9% had a fancy color monitor. Nobody* used an Apple II with a TV, except in maybe the very early days.

      This is why you should ignore the specs: the Apple didn't really compete with Atari/Commodore/etc. It got crushed by the IBM PC.

    • TMWNN 2 days ago

      Context for others: Standalone computer monitors per se did not exist at retail in 1977. The only choices were either televisions, or displays for security-camera systems such as the Sanyo VM-4209 <https://collections.vam.ac.uk/item/O1461694/sanyo-vm-4209/>. The latter, because of its high quality and perfect size for stacking on top of the Apple next to disk drives <https://www.reddit.com/r/VintageApple/comments/3snpqd/i_foun...>, became the iconic Apple II monitor of the early years until purpose-built computer monitors from Amdek and others, as well as Apple's own Monitor III <https://en.wikipedia.org/wiki/Apple_Monitor_III>, became widely available.

WalterBright 2 days ago

Sometimes you only had upper case because the character generator was a 5x7 grid, which wasn't really good enough to display lowercase.

  • masswerk a day ago

    Maybe interesting in this context: the CRT font renderers for the DEC PDP-1 (notably one of the first commercial machines for which a CRT was available) also rendered at 5 × 7 dots. (As bit patterns for 35 dots fit perfectly into two 18-bit words, with just one bit going unused.)

    The first one I know of with preserved source code was by Ben Gurley (the designer of the PDP-1) and Weldon Clark, April 10, 1961. [0] This one featured just upper-case letters, figures, and the bare minimum of special characters. Notably, the input devices used had provisions for upper and lower case, hinting at the programmers wanting something workable done quickly. Moreover, the character set may not have been fully defined yet, since (reportedly) Ed Fredkin, then at BBN (and mostly responsible for BBN acquiring the first production prototype), was involved in the final definition.

    Later the same year, we also see a fully developed font renderer [1], which uses the spare bit in the glyph definition to indicate a downward shift of the entire font matrix, enabling lower-case characters with descenders. (This convention was also used by the later hardware symbol generator, the Type 33.) With this, all characters can be displayed with perfect readability (with possible compromises only involving lower-case "f"). At the same time, we also see support for the full character set, including representations of non-printable characters for editing.

    [0] https://masswerk.at/spacewar/chargen/pdp1-bbn37-crt.html

    [1] https://masswerk.at/spacewar/chargen/pdp1-1961-codeword-crt....

    [2] Overview with links to related glyph dumps: https://masswerk.at/spacewar/chargen/

  • nine_k 2 days ago

    5x7 is fine to display both lowercase and uppercase. You could even afford some stylistic variations!

    Lack of ROM space for lowercase could be a more plausible explanation for an early computer.

    • WalterBright 2 days ago

      You're probably right. There was a limitation to upper case only; I may misremember the cause.

  • kps a day ago

    This article doesn't mention it explicitly, but the Apple II used the same character generator IC as the TV Typewriter, the Signetics 2513, with its 64-character ROM.

TMWNN 2 days ago

Quoting myself from the prior discussion (with a correction):

-----------------

Woz explains why the original Apple II (1977) doesn't support lowercase letters. That's not surprising, in retrospect; of its major contemporaries, the TRS-80 Model I does not either, despite being developed by a major corporation with substantial resources. (The Commodore PET 2001 does support lowercase, but the keyboard is so terrible that it might as well not.)

He doesn't explain why the Apple II+ (1979)—after the II's market success was proven—doesn't support lowercase letters. Even if software uses graphics mode to display lowercase letters, the II and II+'s keyboard does not have physical/electrical support for detecting shifted letters. Since graphics mode is cumbersome and slow, word processors for the II and II+ typically use reverse video to indicate capital letters, and use another key like Escape as a shift toggle. A popular alternative is the shift-key mod that fattire mentioned, which requires soldering of a wire to one of the paddle ports.

The lack of support is because the company was working on the Apple III (1980), which it expected would quickly obsolete the II series. The III has built-in 80-column text and full lowercase support, at both the character-font and physical-keyboard levels. Apple had incentive to not make the II too attractive.

Neither Woz nor anyone else at Apple expected that a) the III would quickly fail, and b) the II series would remain Apple's bread and butter. Without the III's distraction the II+ would surely have had built-in lowercase software and hardware support, or there would have been another II model around 1981 with it. As it was, the III took up so much of Apple's resources that the Apple IIe did not appear until 1983, by which time the IBM PC had surpassed the II series.

  • chuckadams a day ago

    Steve Jobs's irrational hatred of fans, along with his micromanagement of literally every square inch of the case design, didn't help the Apple III with its perennial overheating issues. He hastened the demise of the G4 Cube for exactly the same reasons. My favorite fact about the III is how it became an officially sanctioned troubleshooting technique to pick the machine up a few inches off the desk and drop it.

caminanteblanco 18 hours ago

I was not expecting a post by Wozniak. I really do enjoy his writing style.

gbraad a day ago

TL;DR:

Wozniak had zero checking, zero savings: no money to spend time on a risky rewrite. Jobs didn't see a necessity.

sillystu04 2 days ago

I wish modern computers considered casing to be purely a matter of style, such that “S” == “s” evaluates to true.

Casing is rarely semantically important, so case sensitivity is widely ignored. But in many circumstances this gets implemented in a haphazard way. For example some sites consider foo@gmail.com and foo@GMAIL.COM to be different email addresses.

Also case sensitivity makes internationalisation awkward, particularly with German.

Sadly it’s too late to change this because doing so would be such a breaking change.

  • alpaca128 2 days ago

    But modern computers do that for the most part. Standard search features default to case insensitivity and Windows & Mac OS have a case insensitive file system. But in other contexts it doesn't make sense considering it's part of the grammar.

    > Also case sensitivity makes internationalisation awkward, particularly with German

    How would it make anything more awkward?

    • zelphirkalt 2 days ago

      Standard search features, for example the Windows start menu, have such severe bugs and shortcomings that I am not sure they get anything right, and I would not trust them with lower and upper case either. For example, type too early in the start menu and you might just find nothing, after which it might search online. If you delete the text and type again, suddenly it finds stuff... a race condition right there. Or it finds something when you type the first 3 letters that match, but when you type the fourth letter, the search result disappears...

      People make search so broken that it makes me think they have no idea what they are doing. The bigger the company, the worse the search usually is. I think it is maybe a result of non-tech companies hiring silly contractors who do half-assed work, hoping that people will be happy enough that they get another contract later. Their customers are not the users, but the company that hired them. In the case of MS it is probably ... hm, idk, sheer incompetence, I guess.

      • alpaca128 a day ago

        Oh, I fully agree about Windows search. But that's more a case of a company just having no incentives to make their product usable because people buy it anyway.

  • volemo 2 days ago

    I wish [1] our (western) culture was not built on a bicameral [2] alphabet, but c'est la vie: uppercase and lowercase are distinct throughout our lives, and thus computers must not break the norm. “MW” != “mW” and “Taylor” != “taylor”.

    [1]: Capitalisation could have been made purely stylistic, the same way italics are — it’s a helpful hint for the reader, but isn’t expected to convey meaning on its own, therefore, generally can be stripped without loss of information.

    [2]: https://wikipedia.org/wiki/Unicase

  • cryptonector a day ago

    > Casing is rarely semantically important

    Yet you're using it yourself.

    I'd rather we stop using capitals than have to make everything case-insensitive. Case-insensitivity is so very annoying to me.

    But you're in luck: case-insensitivity is available in many contexts.

  • kstrauser 2 days ago

    > For example some sites consider foo@gmail.com and foo@GMAIL.COM to be different email addresses.

    I like it when such sites have a bug bounty program, or free offers for new users. Free money is nice.

  • amelius 2 days ago

    I like that I can write:

        Window* window = new Window(...)

    • int_19h 11 hours ago

      Imagine if instead you could write:

        w̲i̲n̲d̲o̲w̲* window = new w̲i̲n̲d̲o̲w̲(...)
      
      (that's the original ALGOL 60 reference language syntax, for the curious; although it didn't have user-defined types, so it was keywords that were underlined)

      Then there's the Smalltalk approach:

        aWindow := Window new ...

    • sillystu04 2 days ago

      You probably don’t like this:

        String string = “String.com”;

      • Dwedit a day ago

        I can't imagine that working in any programming language, given the smart quotes.

      • amelius 2 days ago

        I don't see the problem to be honest.

    • tedunangst 2 days ago

      You don't need case sensitivity for that.

      • amelius 2 days ago

        Perhaps if you redesigned the language. But case sensitivity makes things a lot clearer.