CFB51 College Football Fan Community
The Power Five => Big Ten => Topic started by: utee94 on April 09, 2025, 11:56:20 AM
-
@betarhoalphadelta (https://www.cfb51.com/index.php?action=profile;u=19) : Split this out from Weird History thread.
-----------------------
Actually from 4/7 but I just saw it on my Facebook feed:
ON THIS DAY | In 1964, IBM unveiled the System/360 line of mainframe computers, its most successful computer system. It was called the "360" because it was meant to address all possible sizes and types of customer with one unified software-compatible architecture.
(https://i.imgur.com/FJARkcY.png)
-
....and it's still smaller than my Google Pixel 7 Pro! Amazing!
srsly.....why are phones so big now?
-
Just curious. What exactly does IBM do these days ?
-
srsly.....why are phones so big now?
Because for a lot of people, they are pretty much their primary compute/connectivity device outside of something used purely for work.
I can't stand using my phone (Pixel 5) for much, because the screen isn't big enough. I'd personally love to have a Pixel Pro Fold, to have that level of screen size.
-
Just curious. What exactly does IBM do these days ?
AI and software stuff, I think.
-
AI and software stuff, I think.
They're still QUITE a player in the hardware game. Not just AI/software. It's all enterprise/datacenter stuff though, so not something a typical consumer would ever see.
I don't want to get into it more, and assume our other resident EE doesn't want to either, because their business is interrelated to both of ours...
-
They're still QUITE a player in the hardware game. Not just AI/software. It's all enterprise/datacenter stuff though, so not something a typical consumer would ever see.
I don't want to get into it more, and assume our other resident EE doesn't want to either, because their business is interrelated to both of ours...
Data centers and servers?
-
Speaking of history and IBM.... I remember a time before the term PC was common, and the machines people had that would later be called PCs were called "IBM-compatible." You might have an IBM desktop, or you might have some other brand which was IBM-compatible (we had a Vendex). Near as I can tell, it just meant they operated on DOS, so I guess Microsoft was already behind the curtain. I guess Apple was a thing at that time....I don't really know. I don't know anyone who had an Apple Macintosh in the late 80's, if they were around. But it seems like the term "IBM-compatible" would be meant to distinguish those machines from something else.
Fast forward a decade and the term had become PCs, which referred to Windows-based machines, and again, not-Apples. The term PC is odd to me, because it stands for Personal Computer, which Apples also are. And of course, nobody calls them Apples anymore, they're referred to as Macs.
-
IBM compatible was a big deal in the early 80's, best I can tell. My understanding is that IBMs were seen as more business friendly, whereas other competitors (Apple II and others) were seen more for education or gaming. Compaq made the first clone; they had to reverse-engineer the BIOS, and after that the floodgates busted open and many clones were out there.
Microsoft played the long game, setting themselves up for long-term winning with DOS and some pretty tough business practices.
-
Speaking of history and IBM.... I remember a time before the term PC was common, and the machines people had that would later be called PCs were called "IBM-compatible." You might have an IBM desktop, or you might have some other brand which was IBM-compatible (we had a Vendex). Near as I can tell, it just meant they operated on DOS, so I guess Microsoft was already behind the curtain. I guess Apple was a thing at that time....I don't really know. I don't know anyone who had an Apple Macintosh in the late 80's, if they were around. But it seems like the term "IBM-compatible" would be meant to distinguish those machines from something else.
Fast forward a decade and the term had become PCs, which referred to Windows-based machines, and again, not-Apples. The term PC is odd to me, because it stands for Personal Computer, which Apples also are. And of course, nobody calls them Apples anymore, they're referred to as Macs.
The Wiki article on this is pretty solid: https://en.wikipedia.org/wiki/IBM_PC_compatible
There was a lot more to it... Much bigger proliferation of computers, all with proprietary hardware architectures. Proprietary architectures meant proprietary software. Then IBM came out with the PC with the Intel 8088 processor, and everyone copied the architecture so that they could all use the same software. Over that decade all of the other proprietary brands with the exception of Apple basically died off, Microsoft became king of the OS market, and Intel became king of the processor market.
And as the article points out, the more modern term became "Wintel", for a Windows OS computer based on the Intel (or AMD) x86 architecture, because "IBM-compatible" no longer carried any weight.
-
What I recall from my elementary school years--and how I still think about it--is that the "compatible" in IBM-compatible meant that my friends with IBM-Cs could run their games on my IBM-C and vice versa. Whereas my friends with Commodore 64's could not boot their games over at the houses of us kids with IBM-Cs, and vice versa.
That was a source of consternation for me, because I thought the Commodore 64 had way cooler games, and I wanted one, and I was so pissed when we finally got a computer one day and my dad had come home with an IBM-compatible. It was because a guy he worked with was a techie for his time, and he told my dad the Commodore 64 was basically junk, and he recommended something that could run WordPerfect and some other stuff I didn't care about.
I wanted the games, dammit.
But I did learn DOS on that old Vendex, which I was proud of until Windows hid DOS from the minds of the public and eventually did not run on top of DOS at all, so nobody cared about my cool DOS skills anymore.
I still maintain Microsoft propagated Windows so hard to make everybody forget about DOS 6.0, which as far as I could tell was basically a virus Microsoft decided to release under the guise of "operating system."
-
The good news if you've learned DOS is that it makes it a lot easier to learn Linux as you have already lived in the command line world.
And all the coolest geeks run Linux these days.
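A lot of the everyday commands map almost one-to-one. Something like this (bash equivalents on the right, just for illustration):

  dir            ->  ls -l
  type file.txt  ->  cat file.txt
  del file.txt   ->  rm file.txt
  cd \games      ->  cd /games
  cls            ->  clear

The syntax changes, but the habit of living at a prompt carries right over.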
-
I run UNIX variants that are NOT Linux because screw those guys.
IBM-compatible, as bwar pointed out, really meant MS-DOS OS plus x86 architecture.
There were a handful of Apple clones as well back in the early days. Franklin was one of them and my best friend owned one. Apple sued the bejeezus out of them and forced them all out of business.
Oh and IBM is also big into commercial enterprise consulting services. Something like 30% of their revenue comes from that. But bwar is correct in that I can't talk much more about them.
-
The good news if you've learned DOS is that it makes it a lot easier to learn Linux as you have already lived in the command line world.
And all the coolest geeks run Linux these days.
I started loading my old laptops with Linux years ago, but I'm not really a cool geek. The distros of at least the last 15 years (the time span that I first messed with Linux) are very visually similar to the Windows GUI and everything seems kinda dumbed down and automatically "easy," even for somebody like me who had never been on a Linux OS before. I never know how to do something on Linux, yet I'm almost always right on the first guess for where to find something.
I never got into the command line stuff and don't know any Linux commands. Never use the command line, tbh.
On the ease-of-use thing, that's as opposed to me trying my hand at Macs, which are reputed to be super-easy and marketed as such......"everything just works," Mac users would always tell me. Coming from a lifetime of Windows, I find Macs un-intuitive and I struggle with them, particularly the file system. In my experience, Linux is the OS that Macs are purported to be.
There used to be an OU fan here on the board named CrimsonGaloot who helped me get going with Linux, and taught me how to load it side-by-side with Windows and dual-boot if I chose. He's not around anymore, and I no longer have his email address, and that's a shame because he was a good guy and good tech support.
-
Anyone who is even vaguely interested in the late 70s/early 80s personal computer revolution, should watch a series from AMC called "Halt and Catch Fire." It starred one of my favorite actors, Lee Pace, and it was just really well done IMO. It was strongly nostalgic to me for some obvious reasons.
-
I run UNIX variants that are NOT Linux because screw those guys.
IBM-compatible, as bwar pointed out, really meant MS-DOS OS plus x86 architecture.
There were a handful of Apple clones as well back in the early days. Franklin was one of them and my best friend owned one. Apple sued the bejeezus out of them and forced them all out of business.
Oh and IBM is also big into commercial enterprise consulting services. Something like 30% of their revenue comes from that. But bwar is correct in that I can't talk much more about them.
Isn't Linux open source? Who are we "screw-you"ing and what did they do?
I don't know what x86 architecture means, so that's lost on me.
Apple is nothing if not consistent. Proprietary to a fault, even when it screws the customer.
Now I'm super-interested in what it is about IBM y'all can't talk about. Sounds shady. Possibly dangerous and illegal. Are y'all spies? Am I gonna read in the papers one day about two rogue agents posed as electrical engineers who took down the human trafficking ring posing as business consulting services?
-
Anyone who is even vaguely interested in the late 70s/early 80s personal computer revolution, should watch a series from AMC called "Halt and Catch Fire." It starred one of my favorite actors, Lee Pace, and it was just really well done IMO. It was strongly nostalgic to me for some obvious reasons.
I watched it when it was on. Highly enjoyable. Its setting really pre-dates my computer awareness since I believe the show starts set in either the late 70's or very early 80's and I didn't know anything about computers until at least 1987, and even then, I knew nothing technical. Other than I learned some DOS commands, I guess. That's not really technical....that's just having to learn how to boot up my games and stupid word processor to do my homework.
Nevertheless, it was nostalgic for me too, and I couldn't exactly tell you why. Maybe just anything about the 80's. But it did reference a few major things here and there I would've been aware of in the computer world, even as young and clueless as I was.
Also........screw Donna. I can't even remember what she did in s4, but the way she treated Bos was crap, and I haven't forgiven her for it.
-
Linux guys just think they're the rock stars of the UNIX world. They sniff their own farts. Screw 'em!
x86 is a term referencing Intel's CPU architecture family. It started with the 16-bit 8086 and then moved on to the 80186, 286, 386, and so on. Hence the generic term x86.
Apple do what Apple do, same as it ever was.
I'd tell you but then I'd have to ki.... mmmm.... nevermind.
-
I started loading my old laptops with Linux years ago, but I'm not really a cool geek. The distros of at least the last 15 years (the time span that I first messed with Linux) are very visually similar to the Windows GUI and everything seems kinda dumbed down and automatically "easy," even for somebody like me who had never been on a Linux OS before. I never know how to do something on Linux, yet I'm almost always right on the first guess for where to find something.
I never got into the command line stuff and don't know any Linux commands. Never use the command line, tbh.
Yeah, Linux is a LOT easier to use than it was 20 years ago. Honestly, if you want someone to have a PC that's largely just a dumb terminal / web browser, I'd rather set up a Linux system than Windows, because they don't know enough to screw it up, and there's not a large group out there trying to write Linux viruses. It's like a Chromebook, but completely open and not locked into the Google world.
I recall that when I moved from SoCal to Atlanta in 2005, I embarked on a project to get deeper into Linux as a learning experience. So I repurposed a desktop tower PC that I had, using a TV capture card and a graphics card, and built my own DVR based upon the MythTV software that ran on top of Linux at the time.
It was all pretty cool and not that hard--except that Linux didn't have a built-in driver for the ATi graphics card, so I had to compile it myself from source. Made things harder than they needed to be, but it worked!
The cool thing about having an open-source DVR is that all the features that would piss off content providers if a cable/satellite/streaming service built them into its DVR software are completely available. Once a recording was complete, the MythTV DVR software would go in, analyze the recording to find the commercials, and just cut them right out of the file. So you didn't even have to skip or fast forward through commercials. They were just--poof!--gone.
But now Linux is pretty simple to do most everything using the GUI. But if you know how to get under the hood with command line, you get WAY more control than that.
-
macOS has been a certified UNIX since 2007 (it had been Unix-based since the original OS X). They do allow access to the CLI, but I don't think they allow as much control as a normal UNIX OS would have. But that's just what I've heard, I've had no experience doing any CLI operations on an Apple product since the days of Apple ProDOS in 1982 or 83 maybe.
-
What I recall from my elementary school years--and how I still think about it--is that the "compatible" in IBM-compatible meant that my friends with IBM-Cs could run their games on my IBM-C and vice versa. Whereas my friends with Commodore 64's could not boot their games over at the houses of us kids with IBM-Cs, and vice versa.
That was a source of consternation for me, because I thought the Commodore 64 had way cooler games, and I wanted one, and I was so pissed when we finally got a computer one day and my dad had come home with an IBM-compatible. It was because a guy he worked with was a techie for his time, and he told my dad the Commodore 64 was basically junk, and he recommended something that could run WordPerfect and some other stuff I didn't care about.
I wanted the games, dammit.
But I did learn DOS on that old Vendex, which I was proud of until Windows hid DOS from the minds of the public and eventually did not run on top of DOS at all, so nobody cared about my cool DOS skills anymore.
I still maintain Microsoft propagated Windows so hard to make everybody forget about DOS 6.0, which as far as I could tell was basically a virus Microsoft decided to release under the guise of "operating system."
I had a C64, in fact I still have it. I got it about 1985 or ‘86. Great little computer. Load *.*, 8,1
DOS was cool way back when. Still freaks some younger ppl out when I go to the command line for something vague.
Used to be a fun one: net send in the command line, then their username, then the message. One time we did one that said all data will be deleted, press Enter. We could hear the cussing across the room.
-
I watched and enjoyed the 1st and 2nd season of Halt. It kinda went off the rails after that.
-
I didn't really like the time jump, not sure what season that was. But I got used to it, and still enjoyed it up to the end.
-
I had a C64, in fact I still have it. I got it about 1985 or ‘86. Great little computer. Load *.*, 8,1
DOS was cool way back when. Still freaks some younger ppl out when I go to the command line for something vague.
Used to be a fun one: net send in the command line, then their username, then the message. One time we did one that said all data will be deleted, press Enter. We could hear the cussing across the room.
Load *.*, 8, 1 lolz, I remember that. Still don't know what it means.
Me and my cousin, circa about 8th grade or so, erased the hard drive from DOS on my uncle's computer because....well....I think we just wanted to see if it would work. It did. My uncle didn't have the heart to do anything to me, but I heard my cousin got Vietnam-vet-cussed after I left. My dad thought it was hilarious, though he warned me not to do it to our computer or he'd end me. Speaking of history.....we both probably almost were history on account of that.
-
Linux guys just think they're the rock stars of the UNIX world. They sniff their own farts. Screw 'em!
I thought Linux was a separate language from UNIX.
-
I thought Linux was a separate language from UNIX.
Linux is a Unix-like operating system, not a separate language. It is open source and not proprietary like Solaris (Sun Micro), AIX (IBM), HP-UX (HP), or others.
-
Yeah, Linux is a LOT easier to use than it was 20 years ago. Honestly, if you want someone to have a PC that's largely just a dumb terminal / web browser, I'd rather set up a Linux system than Windows, because they don't know enough to screw it up, and there's not a large group out there trying to write Linux viruses. It's like a Chromebook, but completely open and not locked into the Google world.
But now Linux is pretty simple to do most everything using the GUI. But if you know how to get under the hood with command line, you get WAY more control than that.
Yeah, and it's real lightweight too, which is why I stick it on my old PCs once the hardware is obsolete. I've had trouble with Linux natively recognizing the touchpad on Dell laptops, but it's gotten easier and easier over the years to load the driver. I want to say the last time I installed it, it recognized the problem and offered to get the driver for me. It's been a couple years, I might be misremembering. I like the Mint distro. Ubuntu is probably the "flagship"--if Linux has such a thing--and I hear it can do more stuff, but I think Mint is the easiest for somebody coming from the Windows world, so it's mostly what I've used. It also has native apps that I really enjoy. Rhythmbox, for example, is a fantastic music manager for my tastes. It's everything iTunes used to be before Apple lost its damn mind in 2011 starting with iTunes version 11. I despise iTunes now as a music manager, but Rhythmbox on Linux? Wonderful.
I wouldn't mind learning Linux, I'm just not motivated to do it. There's no work benefit for me, and I've never come across something in personal use that made me wish I knew how to utilize the command line. Maybe if I knew more about what's possible I'd be more interested. I pretty much stay away from it unless I'm trying to fix some little issue and I've looked up how to do something.
I have to use the command prompt in Windows sometimes for Python stuff, for work and when I was in school. But I mostly don't use that either.
When I gather the money, I want to build a new desktop with 4 hard drive bays. I don't like dual-boot, partitioned-drive setups because I've found they cause some glitches on the Windows side. I'm gonna install Windows on one drive, Linux on another, and my files will be located on a third so I don't have to reload everything every few years when a Linux distro stops being supported and I have to install a new version. I'll leave the 4th blank, but eventually I'd aim to make myself a Hackintosh. For as much as I dislike Macs, macOS does have music software I like that is simply not available on any other platform.
-
Apropos...
If you're bad with computers, it doesn't mean you're just not good with tech. You're probably just stupid.
https://scitechdaily.com/new-study-a-lack-of-intelligence-not-training-may-be-why-people-struggle-with-computers/
-
I never know how to do something on Linux, yet I'm almost always right on the first guess for where to find something.
What does Lucy's brother in Peanuts have to do with being a cool geek??? So whimsical,sheesh
-
Apropos...
If you're bad with computers, it doesn't mean you're just not good with tech. You're probably just stupid.
https://scitechdaily.com/new-study-a-lack-of-intelligence-not-training-may-be-why-people-struggle-with-computers/
I want you to know I take that personally and I am not amused :96:
-
What does Lucy's brother in Peanuts have to do with being a cool geek??? So whimsical,sheesh
I think you mean Linus. I like him too. He has a security blanket, my folks tell me I did something similar as a toddler. That makes me a cool geek. That's the connection.
-
I wouldn't mind learning Linux, I'm just not motivated to do it. There's no work benefit for me, and I've never come across something in personal use that made me wish I knew how to utilize the command line. Maybe if I knew more about what's possible I'd be more interested. I pretty much stay away from it unless I'm trying to fix some little issue and I've looked up how to do something.
I have to use the command prompt in Windows sometimes for Python stuff, for work and when I was in school. But I mostly don't use that either.
For me it's mainly just that it's free, I know how to use it, and some of the "odd" use cases I have can be done easily. I keep one mini PC that is essentially my home "server". That was the thing I was complaining about a few weeks ago because the hardware had died and I needed to buy a new one, and ran into some weird technical issues trying to load Linux on it because of an old corrupted USB stick.
For example, I was playing around for a while with something called RaspberryPints. It was a Raspberry Pi based web server that would allow you to broadcast your beer tap list to a laptop, tablet, or phone. The Raspberry Pi was... Unstable. It would get corrupted from time to time, requiring a complete reinstall. But because I had a Linux server sitting there connected to my router, I was able to simply install the apache web server and the RaspberryPints web content onto it, and suddenly I had a much more stable web server for my tap list.
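Roughly, on a Debian-type distro it was something like this (package names and paths from memory, so treat them as approximate rather than gospel):

  # the usual web stack that RaspberryPints sits on top of
  sudo apt install apache2 php mysql-server
  # drop the RaspberryPints web files into apache's default web root
  sudo cp -r raspberrypints/* /var/www/html/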
Could I do that in Windows? Probably. But the sorts of projects that people create for this sort of thing are usually Linux geeks like me, so being native to Linux just makes life easier for me.
When I gather the money, I want to build a new desktop with 4 hard drive bays. I don't like dual-boot, partitioned-drive setups because I've found they cause some glitches on the Windows side. I'm gonna install Windows on one drive, Linux on another, and my files will be located on a third so I don't have to reload everything every few years when a Linux distro stops being supported and I have to install a new version. I'll leave the 4th blank, but eventually I'd aim to make myself a Hackintosh. For as much as I dislike Macs, macOS does have music software I like that is simply not available on any other platform.
This portion would be a good use for a NAS for backup, instead.
And now that I'm working through this miniPC build, I'm going through and finally cleaning up a bunch of disparate personal file sources (partly pictures, but other stuff too). I want to get it fully organized for backup purposes.
That miniPC will have local storage, will have a USB HDD dock, and then I also have a NAS. With Linux, it's trivially easy to set up a cron job (regularly scheduled action) that runs a command-line rsync command to make sure all copies of these personal folders remain synchronized, so as I add new files or pictures, I won't have to manually maintain both the local and the backup copies. And then I'm looking at using cloud storage [final safety location for my personal files in case my house burns down lol], and many of the cloud backup storage providers have Linux clients for synchronizing with their service too.
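Just to give an idea of what that looks like (the paths and NAS mount point here are made up for illustration, not my actual setup), the crontab entry is basically a one-liner:

  # every night at 2:30 AM, mirror the local folder to the NAS
  30 2 * * * rsync -a --delete /home/me/personal/ /mnt/nas/personal/

The -a flag preserves permissions and timestamps, and --delete keeps the NAS copy an exact mirror, so anything I delete locally goes away on the backup side too.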
If your files are important, I'd rather they be stored on your primary storage / desktop (one copy), on external storage or a NAS (second copy), and in an offsite location (third copy). It's the 3-2-1 rule for backup.
-
I want you to know I take that personally and I am not amused :96:
Believe me, if you're dual-booting PCs with Windows and Linux and talking about setting up a Hackintosh...
...you're good with computers.
The fact that you're not a command-line expert doesn't change that.
-
This portion would be a good use for a NAS for backup, instead.
And now that I'm working through this miniPC build, I'm going through and finally cleaning up a bunch of disparate personal file sources (partly pictures, but other stuff too). I want to get it fully organized for backup purposes.
That miniPC will have local storage, will have a USB HDD dock, and then I also have a NAS. With Linux, it's trivially easy to set up a cron job (regularly scheduled action) that runs a command-line rsync command to make sure all copies of these personal folders remain synchronized, so as I add new files or pictures, I won't have to manually maintain both the local and the backup copies. And then I'm looking at using cloud storage [final safety location for my personal files in case my house burns down lol], and many of the cloud backup storage providers have Linux clients for synchronizing with their service too.
If your files are important, I'd rather they be stored on your primary storage / desktop (one copy), on external storage or a NAS (second copy), and in an offsite location (third copy). It's the 3-2-1 rule for backup.
My plan would not entail the 3rd hard drive being a backup, it'd be the primary file storage....files to be accessed from either the Windows or Linux side, so I don't have one set of files in one environment and another set of files in the other environment, and so I don't have to duplicate all the files on both sides. That way, I boot into whichever OS I want to use, but all my files are there for me. Seems like I should be able to get away with smaller drives for the two I just want to put the OS on.....I don't plan on filling them up with files.
A NAS would still seem to be a good backup. I have a Synology NAS with two drives, but I'd need to replace them with something much bigger. They're old drives we found in my wife's closet from who-knows how long ago, and they're only like 500 GB, I think. They're set up in RAID 1, or whatever array type mirrors the drives in case of failure. I only keep my music on them, because I don't think all our videos, photos, etc. would fit. And the Synology software plays nice with the Plex app on the living room Firestick, so I can access my music collection in the living room through the nice studio speakers I have hooked up to the TV.
I don't know about paying for off-site backup. I see the wisdom in it, but it depends on how much it is, and also I don't trust other companies with my data. Google, Facebook, and Microsoft already have more on me than I'm comfortable with. I have an external backup drive that I update every couple of months, and it stays in a fireproof/waterproof safe. I've not yet felt the urge to do off-site cloud storage.
Have you ever messed with Wine in Linux? I never have. I used to hear it was glitchy, but I've also heard in the last few years it's gotten better. Depending on the capability and reliability there, maybe I wouldn't even need a Windows drive.
Also, maybe we should section off these posts to a new tech-nerd thread. I feel like I'm derailing the history thread pretty severely at this point.
-
I split it out.
For the NAS, if it's that old, with drives that old, you might want to replace the NAS, not just the drives.
I've never messed around with WINE. I mean, I like wine, but I've never really tried WINE. For me, I've been happy enough with what I can do in Linux that I haven't cared to do anything with Windows.
-
The NAS is pretty new. Just the drives are old.
If I could use the MS Office suite in Linux, I might not care for Windows at all. I assume the cloud version of 365 would be fine on Linux, but I like the apps on my PC. Can't remember where, but I believe I used the cloud versions before and it was different than the local-based programs. I guess I'd also need to look into how my IDE's work (if they work) on Linux.
Other than that, honestly, I basically just goof off on the interwebz. Can't really think of much I do that has to be on Windows.
-
The NAS is pretty new. Just the drives are old.
If I could use the MS Office suite in Linux, I might not care for Windows at all. I assume the cloud version of 365 would be fine on Linux, but I like the apps on my PC. Can't remember where, but I believe I used the cloud versions before and it was different than the local-based programs. I guess I'd also need to look into how my IDE's work (if they work) on Linux.
Other than that, honestly, I basically just goof off on the interwebz. Can't really think of much I do that has to be on Windows.
Is what you're doing with MS Office a work-related thing? IMHO if not, most of what you want can probably be handled in LibreOffice.
-
Is what you're doing with MS Office a work-related thing? IMHO if not, most of what you want can probably be handled in LibreOffice.
Not usually, as work has to be done on the work laptop.
However, I've used Libre and Apache OpenOffice over the years, and while they're way better than nothing, I have run into functionality problems in both the Word and Excel equivalents. I only briefly messed with the slide presentation software, but it seemed painfully behind PowerPoint, and I have done many personal slide presentations, and plan to do more.
I will admit, OpenOffice is highly impressive for something that's free, and I've always had it on my Linux machines.
-
Yeah, I don't know that I'd base a business around Libre/Open as they're behind MS on functionality, but then again I'm not trying to... The question for me is "are they close enough for what I need?"
-
I think you mean Linus. I like him too. He has a security blanket, my folks tell me I did something similar as a toddler. That makes me a cool geek. That's the connection.
Tiger rag?
-
Load *.*, 8, 1 lolz, I remember that. Still don't know what it means.
Me and my cousin, circa about 8th grade or so, erased the hard drive from DOS on my uncle's computer because....well....I think we just wanted to see if it would work. It did. My uncle didn't have the heart to do anything to me, but I heard my cousin got Vietnam-vet-cussed after I left. My dad thought it was hilarious, though he warned me not to do it to our computer or he'd end me. Speaking of history.....we both probably almost were history on account of that.
Best I recall it means to load the equivalent of the .exe file. The 8 is the device number of the disk drive (a second drive would have been device 9), and the 1 tells it to load the program at the memory address stored on the disk rather than the default BASIC location. I never knew what the 2nd and other drives would be good for, and damn were they slow and noisy.
-
It’s funny because people are still out there running old C64’s and creating new hardware. You can buy a USB stick that will adapt, there is or was a hard drive. A few enterprising spirits have even surfed the web in some kind of text only slow mode.
I never saw a Mac at any of my friends' houses except one guy in HS who was more of a casual friend. It was so much more capable than the PCs of the day it blew my mind, and it was already 4-5 years old.
Honestly, the early 1980’s was such a great time to be growing up. Computers were just starting to become common, and there was so much choice in the early days. Apple ( II or Mac), C-64 and then later the Amiga, PC compatible, Atari 400, 800, etc.
Bulletin boards, disk drives, dial up modems.
-
I think you mean Linus. I like him too. He has a security blanket, my folks tell me I did something similar as a toddler. That makes me a cool geek. That's the connection.
MDT /s ~???
-
It’s funny because people are still out there running old C64’s and creating new hardware. You can buy a USB stick that will adapt, there is or was a hard drive. A few enterprising spirits have even surfed the web in some kind of text only slow mode.
I never saw a Mac at any of my friends' houses except one guy in HS who was more of a casual friend. It was so much more capable than the PCs of the day it blew my mind, and it was already 4-5 years old.
Honestly, the early 1980’s was such a great time to be growing up. Computers were just starting to become common, and there was so much choice in the early days. Apple ( II or Mac), C-64 and then later the Amiga, PC compatible, Atari 400, 800, etc.
Bulletin boards, disk drives, dial up modems.
Yup. There was a sense of discovery with all the new technology, rather than just the feeling of utility that exists now.
-
Early 80's was great in High School and college
didn't touch a computer in high school and it was 2nd or 3rd year in college until we were forced to go to the library to use a computer to register for classes
that was it
-
We had telephone registration for university classes, which started my freshman year and was considered very high tech.
I don't know when they switched to online registration. I think I still used the phone registration when I went back for grad school in 2001-2003, but I honestly don't remember.
-
Yup. There was a sense of discovery with all the new technology, rather than just the feeling of utility that exists now.
I was always disappointed I never got to play around on any Amiga or Atari computers. It is said that the Amiga was a very good computer for its time, and it had capabilities that were years ahead of its competitors with regard to video editing and graphics. There was one expensive add-on called the "Video Toaster" (what a horrible name) that was used in several TV shows and movies to do the SFX well into the 1990's. Commodore/Amiga could very well have given Wintel a run for their money in the early to mid-90's if they hadn't been so poorly managed.
Speaking of poorly managed, it's such a shame that Atari didn't survive the way Apple did. They had a hit with the 2600 and their arcade games and then released flop after flop after that. I only played the 5200 once or twice, the controller was horrendous. It was like they tried to make the most horrible controller ever in the history of controllers. Then the 7800 was better but still not very good.
Atari was kind of a victim of corporate neglect, having been bought and sold so many times it could never keep any momentum. Once their arcade cash cow dried up that was it.
-
We had telephone registration for university classes, which started my freshman year and was considered very high tech.
I don't know when they switched to online registration. I think I still used the phone registration when I went back for grad school in 2001-2003, but I honestly don't remember.
A&M had a similar set-up; it was called the Bonfire System. Ask your wife about it, I bet she'll remember it vividly. You had to go through a phone menu (press 1 for this or that), and there was a companion book that listed all the courses and the schedule. It worked pretty good. You could connect to the university system via modem and see your schedule and grades and all that, and it would update. It was frustrating because you couldn't see which classes had any space, so you had to keep attempting to get into a class, and if the time/day was full you had to choose another, which would then mess the rest of your schedule up. It got easier as you got seniority because you got to register earlier.
I can still hear the lady's voice in the intro : "Welcome to the Texas A&M University Student Registration System" or something similar.
I think they changed it my very last or 2nd-to-last semester (Spring/Fall 2000), when it all went online. Worked much better.
-
Commodore/Amiga was competing more in Apple's space than in the PC world. Maybe there was room for two players in that space but I'm not so sure.
One of my theater-kid friends had an Amiga and we used it to do all sorts of video titling and special effects. In our English Lit and Theater classes, any group project that came up, we ended up making a movie out of. So when we read 1984 and Brave New World, about 8 of us got together and made the movie we called "Utopia." When we studied Shakespeare, we did a spoof about Hamlet but the main character was Ronald Reagan. I of course played The Gipper himself. There were a couple of others, my dad actually found the VHS tapes and had them transferred to digital and stored in the Cloud.
We used my dad's VHS camcorder and shot on location all over Austin. Then I used our two VCRs and did the video editing, and I borrowed my church's 8-channel mixer and dubbed all of the background music and voiceovers onto the main copy. Finally we used the Amiga to add video titles and transition effects. It was all pretty professional, it helped that both of my parents were Radio-Television-Film majors at UT and had taught me how to do all this stuff.
And these things were like 30 minutes long, we'd take one class period to see ALL the other groups' presentations, but then our group always got its own entire class period. And our teachers would show the videos to all of their other classes as well. It was really a lot of fun and mixed my theater/music/tech geek aptitudes well.
Anyway... memories...
-
Commodore/Amiga was competing more in Apple's space than in the PC world. Maybe there was room for two players in that space but I'm not so sure.
One of my theater-kid friends had an Amiga and we used it to do all sorts of video titling and special effects. In our English Lit and Theater classes, any group project that came up, we ended up making a movie out of. So when we read 1984 and Brave New World, about 8 of us got together and made the movie we called "Utopia." When we studied Shakespeare, we did a spoof about Hamlet but the main character was Ronald Reagan. I of course played The Gipper himself. There were a couple of others, my dad actually found the VHS tapes and had them transferred to digital and stored in the Cloud.
We used my dad's VHS camcorder and shot on location all over Austin. Then I used our two VCRs and did the video editing, and I borrowed my church's 8-channel mixer and dubbed all of the background music and voiceovers onto the main copy. Finally we used the Amiga to add video titles and transition effects. It was all pretty professional, it helped that both of my parents were Radio-Television-Film majors at UT and had taught me how to do all this stuff.
And these things were like 30 minutes long, we'd take one class period to see ALL the other groups' presentations, but then our group always got its own entire class period. And our teachers would show the videos to all of their other classes as well. It was really a lot of fun and mixed my theater/music/tech geek aptitudes well.
Anyway... memories...
A true Nerd's nerd.
-
Best I recall it means to load the equivalent of the .exe file. The 8 is the device number of the disk drive (a second drive would have been device 9), and the 1 tells it to load the program at the memory address stored on the disk rather than the default BASIC location. I never knew what the 2nd and other drives would be good for, and damn were they slow and noisy.
My understanding was that the Commodore 64 basically didn't have a hard drive. The disk drive was kind of it. Obviously a motherboard/processor had to be somewhere. I didn't know you could do multiple drives, or how that would work. My friends only had the one floppy disk drive. If you could do any kind of work or hobby that you could save on those machines, we didn't know how. We just loaded games from a floppy, and that was basically it.
The guy I mentioned who worked with my dad is who told me that, and I have no idea if he really knew what he was talking about. He would've been considered a PC guru for his time, I know that much. In an age where most people didn't have computers, and those who did could only do the most basic of tasks with them, he was doing business, tax documents, all kinds of stuff on his home PC. Since I blamed him for us getting an IBM-compatible, I asked him what the hell. He said the Commodore was kind of a piece of trash that didn't even have a hard drive. Whether he was right or he was just an IBM shill, I couldn't say. I was like 8.
-
A true Nerd's nerd.
There is no denying that my nerd credentials are quite strong.
Electrical engineer, computer programmer, high school band and drama and choir and A/V, regularly played Dungeons & Dragons and the Ultima series of computer games, have read every Isaac Asimov, Piers Anthony, and Tolkien book ever published (plus a few hundred more), and dressed as Dr. Who for a 6th grade Halloween party.
I never wore glasses, at least not until I hit 50. That's about the only thing I'm missing.
-
I never saw a Mac at any of my friends' houses except one guy in HS who was more of a casual friend. It was so much more capable than the PCs of the day it blew my mind, and it was already 4-5 years old.
Honestly, the early 1980’s was such a great time to be growing up. Computers were just starting to become common, and there was so much choice in the early days. Apple ( II or Mac), C-64 and then later the Amiga, PC compatible, Atari 400, 800, etc.
Bulletin boards, disk drives, dial up modems.
I never saw a Mac growing up either. My intro to Macs was in 1993, 8th grade; the nerd-G&T English class I was in had them. No idea how old or new they were. We did some of our work on them, but for the life of me I can't really remember how we used them or what we did with them. iirc--and I might not be--Windows was a thing by then, and we might have even had it, but I was still booting into DOS and loading Windows from there. And I'd already been dealing with DOS for a few years by then, so a visual interface was all new to me. I recall thinking the Mac interface was way more visual than what I was used to, but pretty cool. Now that I think about it, I think we did some kind of presentations with those things. Like book reports and stuff. Maybe something akin to a forerunner of PowerPoint. One of the kids figured out how to do animations and taught us to make our images move around, and I felt like Marty McFly and I'd just jumped into the future.
I was not what you'd call tech-savvy.....like, I wasn't doing stuff like utee talks about, but I still recall the same sense of wonder you guys talk about. I'm a little younger than utee, maybe you too, but even a few years after y'all, that same ethos was still permeating the kids who had access to computers. I remember my dad telling me that one day nearly everybody would have a computer. It was hard to believe him, and I certainly wouldn't have imagined laptops and tablets, and people doing work or hobbies sitting on their couch.
The summer after my 8th grade year I recall AT&T had a commercial that ran, advertising telecommunication. It featured a voice saying something like "Have you ever attended a meeting......from your vacation bungalow?" and it showed a guy in beach clothes on the deck of what was supposed to be a beach-front vacation property with the beach in the background, talking to severe-looking people in business attire on a screen. Then the voice said "You will." I remember thinking that was awesome, and I thought we were supposed to be able to do that right now. Of course, the internet, such as it was, was dial-up, and most everything I knew about was bulletin boards (I got in major trouble on those) and absolutely nobody was doing anything of the sort. After a while I thought AT&T had lied to me, was full of crap, and I basically forgot about the idea for years. I just noticed during the pandemic when Zoom became so popular, that we've had Skype and Facetime and stuff for years, and I never really noticed. The future came and I'd failed to really notice it.
-
There is no denying that my nerd credentials are quite strong.
Electrical engineer, computer programmer, high school band and drama and choir and A/V, regularly played Dungeons & Dragons and the Ultima series of computer games, have read every Isaac Asimov, Piers Anthony, and Tolkien book ever published (plus a few hundred more), and dressed as Dr. Who for a 6th grade Halloween party.
I never wore glasses, at least not until I hit 50. That's about the only thing I'm missing.
I hesitate to call foul on an honest broker such as yourself, but this time I am really tempted. Piers Anthony has to have written over 100 books, and I am a bit skeptical.
I've read a few of his, my sister liked the Xanth novels when she was younger, and I have a friend who was way into him when we were growing up. For the most part I missed out on him. The books I did read, my impression of him was amazing premises.....nobody came up with cool ideas like him.....but I didn't think much of his writing or plots. That's just me though, obviously tons of people love his works.
-
I hesitate to call foul on an honest broker such as yourself, but this time I am really tempted. Piers Anthony has to have written over 100 books, and I am a bit skeptical.
I've read a few of his, my sister liked the Xanth novels when she was younger, and I have a friend who was way into him when we were growing up. For the most part I missed out on him. The books I did read, my impression of him was amazing premises.....nobody came up with cool ideas like him.....but I didn't think much of his writing or plots. That's just me though, obviously tons of people love his works.
That's fair.
I'm about 99% sure I read everything he published before May 1994. Since then, probably not much. So that included about half of the many many Xanth novels, all of the Incarnations of Immortality, all of the Apprentice Adept, most of the Bio of a Space Tyrant, and plenty of rando additional stuff. But I mean, I've read over a thousand books and possibly double that, so it's not inconceivable I would have read all of his. My own library is about 400 books and that's just a fraction of what I've read in my lifetime.
Anyway I thought PA was an entertaining writer, never really had any issues with his stuff.
-
Ever read any of Terry Pratchett's Discworld novels?
-
My understanding was that the Commodore 64 basically didn't have a hard drive. The disk drive was kind of it. Obviously a motherboard/processor had to be somewhere. I didn't know you could do multiple drives, or how that would work. My friends only had the one floppy disk drive. If you could do any kind of work or hobby that you could save on those machines, we didn't know how. We just loaded games from a floppy, and that was basically it.
The guy I mentioned who worked with my dad is who told me that, and I have no idea if he really knew what he was talking about. He would've been considered a PC guru for his time, I know that much. In an age where most people didn't have computers, and those who did could only do the most basic of tasks with them, he was doing business, tax documents, all kinds of stuff on his home PC. Since I blamed him for us getting an IBM-compatible, I asked him what the hell. He said the Commodore was kind of a piece of trash that didn't even have a hard drive. Whether he was right or he was just an IBM shill, I couldn't say. I was like 8.
Our C64 didn't even have a hard drive. It had a tape drive. It was TONS of fun to wait for the computer to read the cassette tape with the program, sequentially, for about 20 minutes before it got to the part of the tape that had your program on it, before you could do *anything*.
The C64 did also have a place to load a cartridge into the back, and there were games that were cartridge-based. We had Solar Fox (https://en.wikipedia.org/wiki/Solar_Fox) and Frogger. Those were better because they loaded immediately.
-
My understanding was that the Commodore 64 basically didn't have a hard drive. The disk drive was kind of it. Obviously a motherboard/processor had to be somewhere. I didn't know you could do multiple drives, or how that would work. My friends only had the one floppy disk drive. If you could do any kind of work or hobby that you could save on those machines, we didn't know how. We just loaded games from a floppy, and that was basically it.
The guy I mentioned who worked with my dad is who told me that, and I have no idea if he really knew what he was talking about. He would've been considered a PC guru for his time, I know that much. In an age where most people didn't have computers, and those who did could only do the most basic of tasks with them, he was doing business, tax documents, all kinds of stuff on his home PC. Since I blamed him for us getting an IBM-compatible, I asked him what the hell. He said the Commodore was kind of a piece of trash that didn't even have a hard drive. Whether he was right or he was just an IBM shill, I couldn't say. I was like 8.
Well, no, the C64 did not have any kind of a hard drive, and neither did most other computers of that era. Even the IBM-compatible computers of the early 80's did not have a hard drive, as far as I can remember. I don't even think any of the computers in my school had hard drives, and I graduated from HS in '94. I remember the first time I even heard about a hard drive, it was probably in the early 90's and one of our teachers was telling us about it. She said it had a 10 (!) megabyte hard drive, and we all oohed and ahhed. 10 megabytes? What on earth would you ever need that much storage for? Remember, back in those days a floppy was 360 KB (or 720 KB), and the still-new "hard floppy" was 1.44 MB. Remember, there were no digital videos, photos, or music files sitting on your home computer in the 80's and early 90's. Most games were on the order of 20-30KB. I read recently that the original Super Mario Brothers game for NES was 32KB.
You could save to the C64 disk if you were working on something, and I think the disk drives (1541) could be daisy-chained together. The disk drives actually had their own memory and processor, which is what made them so expensive. Basically a computer feeding a computer. They were damn loud too.
There was a fast load cartridge you could buy (Epyx) that made the disk drive much faster. Supposedly, some design decision made way back to keep the C64 compatible with VIC-20 software doomed the speed of any disk drive down the road.
-
Our C64 didn't even have a hard drive. It had a tape drive. It was TONS of fun to wait for the computer to read the cassette tape with the program, sequentially, for about 20 minutes before it got to the part of the tape that had your program on it, before you could do *anything*.
The C64 did also have a place to load a cartridge into the back, and there were games that were cartridge-based. We had Solar Fox (https://en.wikipedia.org/wiki/Solar_Fox) and Frogger. Those were better because they loaded immediately.
For some reason we never owned any cartridge games. But friends had them, and yes Frogger was great.
-
We got our first PC in [I think] 1985, and there was a 10 MB hard drive in that one.
It was actually IBM, not a clone. The IBM PC XT (https://en.wikipedia.org/wiki/IBM_Personal_Computer_XT).
-
Ever read any of Terry Pratchett's Discworld novels?
No. I've heard of them but never read them.
-
We got our first PC in [I think] 1985, and there was a 10 MB hard drive in that one.
It was actually IBM, not a clone. The IBM PC XT (https://en.wikipedia.org/wiki/IBM_Personal_Computer_XT).
Rich PPL !
All joking aside, the C64 was a $200 machine back then. The IBM was probably well over $1,000.
-
It's so funny to be from the era I'm from, because we started with DOS type OS, went to early GUI like Win 3.1 and Mac, then to the early smartphone era (Blackberry etc), then to the smartphone and tablet era, and now to whatever is next. A lot of youngsters that come to work for my company (non-degreed of course) have no PC skills at all. It blew my mind that I knew so much more about PCs and Wintel systems. Simple tasks like setting up printers, getting the internet to work, all kinds of misc settings and configurations. Go find somebody under 30 and pull up the command prompt and show them how to use it. They have no idea. Now, you get on the phone or tablet and they know everything, but most companies don't run on phones and tablets.
-
We got our first PC in [I think] 1985, and there was a 10 MB hard drive in that one.
It was actually IBM, not a clone. The IBM PC XT (https://en.wikipedia.org/wiki/IBM_Personal_Computer_XT).
We got our IBM-compatible, I'd say, probably around '88 or '89. But maybe as late as '90. I don't have a clear association in my memory with what grade I was in, so I can't remember for sure. As many times as I saw it boot up I should be able to remember the specs, but I'm just guessing when I say it said 256k memory, but I think that's right. I don't know how much hard drive space. We replaced that around '92 or '93 with a machine that had the slick, new Windows 3.1 pre-installed. That one had 4 MB of RAM and I was hyper-impressed and couldn't imagine what could possibly use that much memory. The hard drive was 200 MB, but my brother-in-law, who is quite a tech-nerd himself, did something he called "stacking" the hard drive, and increased its capacity to 400 MB. To this day I have no idea what that is or what he did. I'm not aware of any procedure to be done on a modern hard drive that can double its storage capacity. I only know he wasn't making it up. Without changing the hard drive, the specs it listed did double.
Unless the bastard knew some kind of way to make it say something different than what it actually was. Which is not out of the question. That idgit didn't even finish high school and has never read an interesting book in his life, but give him a technical manual and he eats that crap up. He learned multiple programming languages and really learned his way around hardware, all self-taught, and eventually helped start the tech office for the Sheriff's Dept. in Baton Rouge where he'd been a cop for years. After he left law enforcement in 2007 he's done networking and programming for municipalities and private companies......and he's completely worthless as far as learning anything from.....dude can't explain anything to save his life, and has no interest. He also doesn't understand what I went back to school for, he heard the word "coding" and thinks I'm a programmer now. I tried to explain to him that my coding ability is mostly limited to data retrieval, manipulation, and ML algorithms, but every time I see him now he shows me something he's working on, which just looks like The Matrix to me, and expects me to understand it. I just smile, nod, tell him good job, and wonder why he doesn't understand I don't know what the hell I'm looking at. And he doesn't even read any cool books. He's the worst kind of tech-nerd. The kind you can't learn from and has no other nerd aspects that make him fun to talk to.
-
I bought my first computer (and the first computer anyone in our family owned) in 1982 using my yard work/lawnmowing money. It was a Timex Sinclair 1000. The 1000 meant it had 1,000 bytes of memory. 1K. And the video memory was shared with system RAM, so if you weren't careful, the instructions from your program could overwrite video memory, and then you couldn't see what you were doing.
I bought my second computer (and the second computer anyone in our family owned) probably the next year, it was an Atari 400. It had a whopping 16K of RAM and I also got the external cassette tape drive peripheral.
Then our family finally bought an Apple IIc in 1984 and that's what we all used until I went off to college in 1990. It had a built in 5.25" floppy drive and we got an external one as well. And we had a daisy wheel printer, so no crummy looking dot matrix papers for US! Which was good because my teachers wouldn't accept dot matrix printing. If you didn't have a daisy wheel true resolution printer, your work was expected to be typewritten.
In elementary school we had Apple IIes but in middle school and high school we used PC compatibles for our work. In high school I learned Pascal on the PCs but I learned FORTRAN by telnetting into the UT Taurus dual cyber mainframes from CDC. I was lucky and could use our Apple IIc to dial up and gain entry, but less tech-fortunate friends had to use the teletypes at the school to gain access. At least we didn't have to use punch cards!
In college my friend and roommate had a Mac, and we also used Macs for our Pascal programming class (UTEE wouldn't switch to C as its base computer class for another couple of years, so I had to learn that one on my own). But my junior year I used some of my scholarship stipend and bought a killer PC system with a 486 DX2/66. That thing was SCREAMING fast. Worked great for playing Doom.
-
We got our IBM-compatible, I'd say, probably around '88 or '89. But maybe as late as '90. I don't have a clear association in my memory with what grade I was in, so I can't remember for sure. As many times as I saw it boot up I should be able to remember the specs, but I'm just guessing when I say it said 256k memory, but I think that's right. I don't know how much hard drive space. We replaced that around '92 or '93 with a machine that had the slick, new Windows 3.1 pre-installed. That one had 4 MB of RAM and I was hyper-impressed and couldn't imagine what could possibly use that much memory. The hard drive was 200 MB, but my brother-in-law, who is quite a tech-nerd himself, did something he called "stacking" the hard drive, and increased its capacity to 400 MB. To this day I have no idea what that is or what he did. I'm not aware of any procedure to be done on a modern hard drive that can double its storage capacity. I only know he wasn't making it up. Without changing the hard drive, the specs it listed did double.
Wow, old memories... When you mentioned it I vaguely remembered doing something similar, and it wasn't actually doubling the space but it was compressing files to make the HDD appear larger.
Some googling brought me to the original "Stacker": https://en.wikipedia.org/wiki/Stac_Electronics
I think when I did this, it might have been the Microsoft version, DriveSpace or DoubleSpace: https://en.wikipedia.org/wiki/DriveSpace
I can imagine that this would have seemed like black magic to a non-techie :57:
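For anyone curious what that trick amounts to, here's a minimal Python sketch of the idea (just an illustration using zlib; the real Stacker/DoubleSpace did this transparently at the sector level in a DOS device driver, and the sample data is throwaway):

import zlib

# Pretend this is a file sitting on the compressed volume.
data = ("Dear diary, today I installed Windows 3.1... " * 2000).encode()

compressed = zlib.compress(data)

print(f"original:   {len(data):>8} bytes")
print(f"compressed: {len(compressed):>8} bytes")
print(f"ratio:      {len(data) / len(compressed):.1f}x")

# Repetitive text like this shrinks absurdly well; typical real-world files
# averaged closer to 2x, which is where the "your 200 MB drive is now a
# 400 MB drive" number came from. Already-compressed files (ZIPs, images)
# barely shrink at all, so the doubling was an estimate, not a guarantee.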
-
In college my friend and roommate had a Mac, and we also used Macs for our Pascal programming class (UTEE wouldn't switch to C as its base computer class for another couple of years, so I had to learn that one on my own). But my junior year I used some of my scholarship stipend and bought a killer PC system with a 486 DX2/66. That thing was SCREAMING fast. Worked great for playing Doom.
Oh yeah, forgot to mention that 2nd PC I mentioned from 92-93 was the shiny new 486. I played a lot of Wolfenstein on it. As I recall, it was still better to boot into games like that from DOS because it was way faster than waiting for Windows 3.1 to load it.
Windows had a lot of fun, small, pre-installed games back then I wish they'd bring back. One I particularly enjoyed was called Fences, I think. I don't remember exactly when they started including chess, but I had fun getting my butt whipped by that for many years. I now realize the Windows chess program has a pretty low Elo rating, and I can hold my own against it.
But around that time I got a Super Nintendo and that constituted most of my gaming from there on out. The world of PCs became mainly a utilitarian thing for me.
-
Then our family finally bought an Apple IIc in 1984 and that's what we all used until I went off to college in 1990.
Also, your ancient ass is older than I thought you were. Damn.....you gonna make it to next year, or what?
-
hah!
-
Also, your ancient ass is older than I thought you were. Damn.....you gonna make it to next year, or what?
Again...
University of Texas Electrical Engineer 1994.
1994 was my graduation year for undergrad. I spent 4 years in college (no 5 or 6 year plan for me, my scholarships only lasted for 4 years and anything beyond that would have been on my own dime, of which I had very few at the time).
As for your question, it's always a crap shoot at this point.
-
Yes, but for some reason it was stuck in my head for many years that you entered in '94.
Which I admit, is less likely to be a handle than a graduation year, but I know a lot of people who adopted email handles for the first time when we were freshmen in college, who used the current year (freshman year) at the end of their handle name, and by the time graduation rolled around, never changed their account names because JohnDoe97 was already what they were used to and was in all their friends' address books.
I think when I met you years ago I just filed it under "Oh, he must've entered UT back in 1994" and never thought more about it. Now I'm old and you have to tell me things several times or else I won't remember them :-D
-
You're only as old as the women you feel.
-Groucho Marx
(maybe)
-
I too graduated in 1994, but it took me 10 years.
-
What was the deal with Pascal as a computer language back in the day? We learned it in HS in one of my computer classes, and it seems like it was popular to teach. By the time I got in college, they were teaching mostly C. I have a vague memory of taking a class that taught C, but for the life of me I can't remember if it was a class about C or if C was part of the class. About 30 years ago to my reckoning (29 to be more exact) so the memory is a little fuzzy.
Anyways, I think Pascal as a language died out.
-
I was forced to learn and use Pascal and Fortran.
-
It's laughable because somewhat recently I was helping somebody in their early 20's do something on the computer, and I was CTRL-X and CTRL-C and CTRL-V and they were looking at me really weird. I asked them what was wrong, and they were bewildered at my keystroke shortcuts. I had to show them how to copy, paste, cut, etc using the keyboard shortcuts. They had no clue. We learned computers with no mouse years ago, so keyboard shortcuts were the norm.
-
What was the deal with Pascal as a computer language back in the day? We learned it in HS in one of my computer classes, and it seems like it was popular to teach. By the time I got in college, they were teaching mostly C. I have a vague memory of taking a class that taught C, but for the life of me I can't remember if it was a class about C or if C was part of the class. About 30 years ago to my reckoning (29 to be more exact) so the memory is a little fuzzy.
Anyways, I think Pascal as a language died out.
I would expect that it was probably popular to teach because it was a much more "modern" programming language than BASIC. I've always viewed teaching programming as largely being separate from the language selected, because fundamentally you're teaching concepts. BASIC didn't have enough on its bones beyond very simple stuff. But Pascal had enough to teach from.
IMHO schools (especially high schools) probably kept using it at the time over C because it was more mature, there were textbooks/resources to use [and re-use each year], etc. It takes a lot more effort for a HS computer class teacher to learn a new language and then develop a teaching curriculum around it rather than just teaching the same stuff they'd used for the last 5+ years. I suspect that's why I learned Pascal in the mid-90s in HS, even though C had probably largely displaced it commercially by then.
Oddly enough I learned C, assembly, and some scripting languages in college, but never C++ or object-oriented programming. To this day I don't really know what OOP is :57:
-
What was the deal with Pascal as a computer language back in the day? We learned it in HS in one of my computer classes, and it seems like it was popular to teach. By the time I got in college, they were teaching mostly C. I have a vague memory of taking a class that taught C, but for the life of me I can't remember if it was a class about C or if C was part of the class. About 30 years ago to my reckoning (29 to be more exact) so the memory is a little fuzzy.
Anyways, I think Pascal as a language died out.
My senior year of high school I signed up for a programming class, which taught Turbo Pascal. It was a disaster, and I learned nothing. I kind of regret not going to school for programming, because the little bit I can do these days, I really enjoy. I might have liked working on larger projects, who knows. But that experience really put me off of it and kind of killed my confidence.
It was a tele-learning class, and we just had a proctor in the room, as our little high school had no teachers qualified to teach any kind of programming. There was a terminal and the instructor, a professor at Northwestern State, had the ability to share his screen with us and the other 5 schools online with us. We could hear him and had mics to ask questions, but we couldn't see him, and he had a thick accent that made him extremely hard to understand, so it was already less than ideal.
The biggest issue was that class met every day, but our high school had switched that year to an alternating schedule they called A and B days. Instead of 1st period through 7th period every day, A Days were longer classes of 1st, 3rd, and 5th periods, B Days were longer classes of 2nd, 4th, and 6th periods, and both days had 7th period for the same amount of time as always. Since that class was not taught during 7th period, we only got every other day's worth of class with the instructor. We were literally missing half of our classes. We were all doing horribly, not understanding anything, and when we complained to the Principal, he sat in on a class and said he didn't see the problem; it seemed like we could hear the instructor and see his screen just fine. He completely missed the point about "Yeah, but what about all the classes he's teaching when we're not here?" In retrospect, he probably understood, but the decision had likely been made at the Parish School Board level and there wasn't anything he could do about it.
I'm not saying that studying overtime and really kicking it into the highest possible gear couldn't have overcome that. But I am saying most of us weren't the type to learn programming on our own at 17 years old, and we all basically got sympathy-D's. We earned F's, undoubtedly. I learned nothing about Turbo Pascal because I was lost by the second week of class.
-
I've talked about this before, but one of my HS classes was using BASIC on the Apple IIgs.
By the first few days of class I realized it was a joke. I.e. the teacher would teach us what a "FOR" loop was, and then we'd have 2 1/2 weeks to "practice" it before he'd move on to the next concept. I knew I was going to be bored out of my %^$!#@& mind in that class.
So I decided that first week to start working on my final project for the class. All it had to be was a program that used every one of the concepts taught in the class, at least once, to do "something". It didn't matter what the program did.
I decided to blow that right out. There was a graphics capability on the IIgs within BASIC, so I ended up programming a version of the popular "Tank Wars / Scorched Earth" game, a turn-based game where you have a bunch of tanks on a 2D terrain and you can adjust the angle and power of your shot to try to destroy the other tanks. I included the obvious parabolic ballistic trajectory, included wind (but no other air resistance), multiple strengths of explosive rounds, etc.
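For the curious, the heart of a Tank Wars-style game is just a little projectile update loop. A rough sketch in Python (my own guess at the idea, not the original BASIC, and all the constants are made up):

import math

def simulate_shot(x0, y0, angle_deg, power, wind=0.0, g=9.8, dt=0.05):
    """Return the (x, y) points of a shot until it hits the ground (y <= 0)."""
    vx = power * math.cos(math.radians(angle_deg))
    vy = power * math.sin(math.radians(angle_deg))
    x, y, points = x0, y0, [(x0, y0)]
    while y > 0 or len(points) == 1:
        vx += wind * dt          # wind nudges horizontal velocity each step
        vy -= g * dt             # gravity pulls the shot back down
        x += vx * dt
        y += vy * dt
        points.append((x, y))
    return points

# 45 degrees, decent power, a little headwind
trajectory = simulate_shot(0, 0, 45, 50, wind=-2.0)
print(f"landed about {trajectory[-1][0]:.0f} units downrange "
      f"after {len(trajectory)} steps")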
I could have gotten an easy 'A' just doing the bare minimum and spending my daily class time reading a book or doing homework for other classes, but I regret nothing.
The second semester was when we started doing Pascal, and for that I didn't come up with anything interesting to do so I just coasted to the easy 'A'.
-
I would expect that it was probably popular to teach because it was a much more "modern" programming language than BASIC. I've always viewed teaching programming as largely being separate from the language selected, because fundamentally you're teaching concepts. BASIC didn't have enough on its bones beyond very simple stuff. But Pascal had enough to teach from.
IMHO schools (especially high schools) probably kept using it at the time over C because it was more mature, there were textbooks/resources to use [and re-use each year], etc. It takes a lot more effort for a HS computer class teacher to learn a new language and then develop a teaching curriculum around it rather than just teaching the same stuff they'd used for the last 5+ years. I suspect that's why I learned Pascal in the mid-90s in HS, even though C had probably largely displaced it commercially by then.
Oddly enough I learned C, assembly, and some scripting languages in college, but never C++ or object-oriented programming. To this day I don't really know what OOP is :57:
Yeah this all sounds about right to me.
I learned Pascal in high school (and then took the class in college even though I already knew it), but I never used it. But, it was widely used for application programming in the 70s and 80s.
I used C and C++ quite a bit in my professional career, and had to teach myself. It wasn't difficult, programming languages all use the same general commands and structures. The syntax obviously differs and some are more rigidly structured than others, but at their core they all must do the same types of things once compiled for the CPU, so they can't really differ all that much.
-
programming languages all use the same general commands and structures. The syntax obviously differs and some are more rigidly structured than others, but at their core they all must do the same types of things once compiled for the CPU, so they can't really differ all that much.
Yep.
The way I think about it is that once you learn to code, 90% of what you do transfers nearly seamlessly to learning a new programming language. You learn the basics of syntax and how things are organized, and then you're off to the races.
Of the remaining 10%, half of that is marveling at how something that was annoyingly convoluted and difficult to implement is just an absolute breeze with the structure of the new language and you don't have to bang your head against the wall doing that any more. And the other half is finding that something that was an absolute breeze in the old language is annoyingly convoluted and difficult to implement so you bang your head against the wall any time you have to do it :57:
-
Oddly enough I learned C, assembly, and some scripting languages in college, but never C++ or object-oriented programming. To this day I don't really know what OOP is :57:
Hey, I might have found a place where I have a nerd-leg up on you!
I'm a bit murkier on the more general concept, but functionally and applicably, regarding data visualization, I do know something about it. I often use a Python module called Matplotlib to visualize data, and there are basically two ways to code when using it. You can do MATLAB style (MATLAB being the language Matplotlib's interface was originally modeled on), or OOP style. They achieve exactly the same thing, it's just that Python treats the road to get there a little differently.
Of course, Python is really just running C under the hood.
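For anyone who wants to see the difference, here's roughly what the two styles look like side by side (throwaway example data, same chart both ways):

import matplotlib.pyplot as plt

xs = [1, 2, 3, 4]
ys = [10, 20, 15, 25]

# MATLAB/pyplot style: you talk to an implicit "current figure/axes"
plt.figure()
plt.plot(xs, ys)
plt.title("pyplot (MATLAB-style) interface")
plt.xlabel("x")
plt.ylabel("y")

# Object-oriented style: you hold explicit Figure and Axes objects
fig, ax = plt.subplots()
ax.plot(xs, ys)
ax.set_title("object-oriented interface")
ax.set_xlabel("x")
ax.set_ylabel("y")

plt.show()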
-
Yep.
Of the remaining 10%, half of that is marveling at how something that was annoyingly convoluted and difficult to implement is just an absolute breeze with the structure of the new language and you don't have to bang your head against the wall doing that any more. And the other half is finding that something that was an absolute breeze in the old language is annoyingly convoluted and difficult to implement so you bang your head against the wall any time you have to do it :57:
I'm not really a programmer or computer language guru, but that makes sense. I learned Python and R in school, plus SQL, which I know is technically a language, but it's really its own thing imo, and I don't put it in the same category as stuff like Python.
R, in the right environment, is crazy-useful for statistical analysis and I understand why researchers love it.
Python is basically Programming For Dummies Who Don't Understand Programming (at least I think....bear in mind I don't really know any other languages), and I see why it's so popular with data scientists. I've looked at some C++ stuff before and as soon as I realized you have to declare variable types, I b like "Nah, I'm out." Python does that automatically, or rather it infers data types at runtime. That's what makes it so easy, so idgit-proof to learn, and so fast to code in.
It also makes it slower, because under the hood it's having to do a bunch of stuff on the back end that you traded off for ease of coding on the front end. That's why the DS world uses popular data modules like Numpy and Pandas which, among other things, streamline the processing tasks and effectively cheat the system so that you get the ease of the Python language but with the speed of compiled C. I mean, they do other useful things too, but that's a lot of it.
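A quick toy illustration of that point (exact timings will vary by machine, but the gap is real and grows with the data):

import time
import numpy as np

n = 5_000_000
values = np.random.rand(n)

# Plain Python loop: the interpreter handles every element one at a time
start = time.perf_counter()
total = 0.0
for v in values:
    total += v * 2.0
loop_time = time.perf_counter() - start

# NumPy: one call, and the loop runs in compiled C under the hood
start = time.perf_counter()
total_np = (values * 2.0).sum()
numpy_time = time.perf_counter() - start

print(f"python loop: {loop_time:.2f} s")
print(f"numpy:       {numpy_time:.2f} s  (~{loop_time / numpy_time:.0f}x faster)")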
At any rate, I see your point, and the programming they taught us to do focused more on learning how to think algorithmically and not really grilling us on syntax. As a result--especially two years removed from school and not really using it much on the job--I often figure out what I want to do, and wind up googling/ChatGPTing some piece of code that I either can't remember or don't know how to do. But I wouldn't even be able to do that if I didn't know how to think through it in the first place. You can't believe how dumb some of ChatGPT's answers are if you just give it a general problem to code for you, even if it works.
-
Oh yeah I had to use Matlab to process output data from various labs throughout college. Just remembered that.
And most scripting languages are just shortform versions of C or Pascal or some other high-level language. Primary difference is that scripting languages are interpreted at run time while the compiled languages are translated ahead of time for quicker and more efficient final operation.
-
Yeah, and it's amazing, as we talk about computing power, RAM capacity, storage capacity, just how much all this power...
...allows software developers to be lazy.
Which I completely understand, of course. Shipping a software product that isn't perfectly optimized for performance generates a lot more revenue than not shipping anything. And with widely disparate computing platforms these days, sometimes it's even beneficial to select an inefficient language like Java because you know it's generic / cross-platform and it's interpreted on the fly by the target rather than being compiled specifically for that hardware architecture. Sometimes that inefficiency is a necessary evil.
But it certainly leads to a lot of bloat in the aggregate...
-
Yeah, and it's amazing, as we talk about computing power, RAM capacity, storage capacity, just how much all this power...
...allows software developers to be lazy.
Which I completely understand, of course. Shipping a software product that isn't perfectly optimized for performance generates a lot more revenue than not shipping anything. And with widely disparate computing platforms these days, sometimes it's even beneficial to select an inefficient language like Java because you know it's generic / cross-platform and it's interpreted on the fly by the target rather than being compiled specifically for that hardware architecture. Sometimes that inefficiency is a necessary evil.
But it certainly leads to a lot of bloat in the aggregate...
It used to drive me crazy, having been trained to create the most efficient code possible.
But these days, the compute power and hardware overhead is SO large by comparison, it really doesn't matter all that much. Still, as a matter of principle...
-
Yeah, and it's amazing, as we talk about computing power, RAM capacity, storage capacity, just how much all this power...
...allows software developers to be lazy.
Which I completely understand, of course. Shipping a software product that isn't perfectly optimized for performance generates a lot more revenue than not shipping anything. And with widely disparate computing platforms these days, sometimes it's even beneficial to select an inefficient language like Java because you know it's generic / cross-platform and it's interpreted on the fly by the target rather than being compiled specifically for that hardware architecture. Sometimes that inefficiency is a necessary evil.
But it certainly leads to a lot of bloat in the aggregate...
Ok, you out-nerded me again. That didn't take long, my victory was short-lived.
-
You can't spell geek without EE.
-
It used to drive me crazy, having been trained to create the most efficient code possible.
But these days, the compute power and hardware overhead is SO large by comparison, it really doesn't matter all that much. Still, as a matter of principle...
Pretend for a moment that I actually have a job doing something I went to school for.
It still behooves me, in the little arena I know something about, to be efficient. I'm not saying I'm great at that. Maybe far from it. I'd doubtlessly benefit from working with people who have been doing it for years and can offer tips on efficiency. But if I'm wrangling or visualizing data with 20 million rows and 200 columns or somesuch, it helps to know how the functions operate under the hood, because some methods can save serious time and hardware-usage-hours, especially on an average computer where a lot of that stuff is still done.
And I guess it's not really the same thing, but for building ML models, knowing the math underneath is helpful, because some things are going to bog down the process horribly, in some cases to the point of crashing, so it helps to understand what types of solutions should be tried for what types of problems. You can fit a really good model that's unfortunately and needlessly inefficient.
It seems to me there's still a necessity for efficiency in the analytics/ML world. But now go back to the part where I don't do that in the real world and remember that maybe I don't know what I'm talking about.
-
Pretend for a moment that I actually have a job doing something I went to school for.
It still behooves me, in the little arena I know something about, to be efficient. I'm not saying I'm great at that. Maybe far from it. I'd doubtlessly benefit from working with people who have been doing it for years and can offer tips on efficiency. But if I'm wrangling or visualizing data with 20 million rows and 200 columns or somesuch, it helps to know how the functions operate under the hood, because some methods can save serious time and hardware-usage-hours, especially on an average computer where a lot of that stuff is still done.
And I guess it's not really the same thing, but for building ML models, knowing the math underneath is helpful, because some things are going to bog down the process horribly, in some cases to the point of crashing, so it helps to understand what types of solutions should be tried for what types of problems. You can fit a really good model that's unfortunately and needlessly inefficient.
It seems to me there's still a necessity for efficiency in the analytics/ML world. But now go back to the part where I don't do that in the real world and remember that maybe I don't know what I'm talking about.
If accuracy and efficiency were the sole goals when bringing a software product to market, then there'd be no issue.
But as bwar alluded to earlier, there's a time-to-market component that can't be ignored. Faster to market means more money over the lifecycle of the product. And also the more time spent on it, the more expensive the final product.
So like anything else, there's a tradeoff between quality, and timeliness. More efficient code, more accurate code, better tested code-- these are all desirable things. But if the tradeoff in time to market and/or production cost is too high, then it's not worth the effort.
And now that compute and storage and most other hardware factors are so large and powerful, the need to create small, efficient, high-quality code is diminished.
-
If accuracy and efficiency were the sole goals when bringing a software product to market, then there'd be no issue.
But as bwar alluded to earlier, there's a time-to-market component that can't be ignored. Faster to market means more money over the lifecycle of the product. And also the more time spent on it, the more expensive the final product.
So like anything else, there's a tradeoff between quality, and timeliness. More efficient code, more accurate code, better tested code-- these are all desirable things. But if the tradeoff in time to market and/or production cost is too high, then it's not worth the effort.
And now that compute and storage and most other hardware factors are so large and powerful, the need to create small, efficient, high-quality code is diminished.
Yep.
And note that this is somewhat a "bringing a software product to market" statement that does NOT generalize to all software.
For ML, I think efficiency is extremely important, especially as the data size scales. Mike may know more about this than I do, but if you're testing iterative algorithm changes, and you can bring the time to analyze your data set from 48 hours to 24, you can test twice as much in the same period of time, and learn more than you would with fewer iterations.
Another one that is highly important is in cloud computing. Often when people think of "software", they think of a "program" running on a "computer". But the massive increase in computing power has meant that this isn't really the case any more.
We have virtualization where you might have a very high number of "computers" running on one "computer". What that means is that a single server will be operating multiple "virtual machines" where it's basically creating a software-virtualized "computer" and operating system that you can run an application that--for all it knows--thinks it's being run on a single PC. Once you start doing this, efficiency becomes very important again. Especially if you're paying for the compute resources from a cloud compute provider.
This is then extended by containerization. This is where you take certain functions that perhaps need to be separated from each other, but at the same time you may need hundreds or thousands of them going on at any given time. Think of something like Ticketmaster when they're selling concert tickets. You may have 5,000 individual users logged in searching for tickets, and each search will be a unique experience for that user that includes how long the tickets they select are held for payment, the process of going through the order / credit card / etc aspect. "Containers" are used to basically replicate that process many hundreds or thousands of times at once, while also making each one independent of all the others b/c you don't want a bug or issue where suddenly you and I are both buying tickets at the same time and a glitch means I get your front-row tickets but my CC is charged my nosebleed price, and you get my nosebleed seats but your CC gets charged your front-row price. If you're doing one of something, efficiency doesn't matter. If you're doing hundreds or thousands of that same thing at once across your hardware... Efficiency is critical.
So it's not meant to be a blanket statement. It's more a statement that if the [in]efficiency of your code is someone else's problem (i.e. it's on someone else's computer), it's not anywhere near as important to you as a developer as if you're going to be the one paying for the computing power to run it at scale, whether that's on-premises or via a cloud computing service.
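Not the same thing as real containers (those isolate at the OS level, with their own filesystem and network view), but just to make the "many independent copies of the same workflow, each with its own private state" idea concrete, here's a tiny Python sketch (the checkout function is made up for illustration):

from concurrent.futures import ProcessPoolExecutor

def handle_checkout(user_id: int) -> str:
    # Each call gets its own isolated state: its own held seats, its own total.
    # Nothing here can leak into another user's session.
    held_seats = [f"row1-seat{user_id}"]
    total = 150.00
    return f"user {user_id}: charged ${total:.2f} for {held_seats}"

if __name__ == "__main__":
    # Fan out many independent "sessions" at once, like thousands of
    # Ticketmaster users all checking out at the same time.
    with ProcessPoolExecutor(max_workers=4) as pool:
        for result in pool.map(handle_checkout, range(8)):
            print(result)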
-
For ML, I think efficiency is extremely important, especially as the data size scales. Mike may know more about this than I do, but if you're testing iterative algorithm changes, and you can bring the time to analyze your data set from 48 hours to 24, you can test twice as much in the same period of time, and learn more than you would with fewer iterations.
Or from 3 weeks to 3 hours :57:
-
So like anything else, there's a tradeoff between quality, and timeliness. More efficient code, more accurate code, better tested code-- these are all desirable things. But if the tradeoff in time to market and/or production cost is too high, then it's not worth the effort.
please just make software that works
I don't give a damn about efficiency, let the little circle spin another couple seconds, for shit's sake
just make it work! Please!!!
-
(https://i.imgur.com/Mldnvm6.jpeg)
-
Back in the 80's I had some off-brand system. I don't remember what it was called. It had about 4 games native to the console, and that was it. No cartridges, no buying new games. It had some version of pong, a "tennis" game, and a couple other things I don't remember. Basically lines on the screen you could move with controllers. My cousins had an Atari and it was like alien-level AI and Pixar-film-worthy graphics compared to whatever that was I had.
Still had fun on it though :-D
Some years later Nintendo came out and I busied my time with an Italian plumber who kept losing his gf to a fire-breathing dragon.
-
(https://i.imgur.com/Mldnvm6.jpeg)
My guess is about 1983/84 from the price and types of consoles. Atari 5200 was a total disaster, they jumped the shark with that one. The controllers were horrid above all, I don't even think I know anybody that owned one. Coleco had some great games, its Donkey Kong port was the best. Intellivision was a good console, a few friends had them.
Atari 2600 was pretty much the first home console (I don't count the original Fairchild Channel F or Magnavox Odyssey, there were very few of them).
-
(https://i.imgur.com/5XYhnzh.jpeg)
-
I can't even imagine attempting to troubleshoot line problems in that rat's nest...
-
Some years later Nintendo came out and I busied my time with an Italian plumber who kept losing his gf to a fire-breathing dragon.
A couple of years ago, Nintendo released their NES Classic Edition (https://www.nintendo.com/en-gb/Misc-/Nintendo-Classic-Mini-Nintendo-Entertainment-System/Nintendo-Classic-Mini-Nintendo-Entertainment-System-1124287.html).
It was a very limited run, and although cheap, VERY hard to find/order. We managed to get one, though.
Great fun these days for us...
-
More than a couple of years ago, my brother really wanted whatever the latest Xbox or PS console was, but his wife was really against it. I guess she thought he'd waste too much time, even though he was by far the primary breadwinner and was always busy trying to keep the homestead and cars maintained and presentable. They subsequently got divorced and about his first purchase after the split was the latest Xbox. And he still managed to hold down his job and maintain the household and car.
Anyway, back then, I bought him one of these for Christmas, and we actually played it a decent amount when hanging out. Our kids loved it, too.
(https://i.imgur.com/ULX7i7I.jpeg)
-
A couple of years ago, Nintendo released their NES Classic Edition (https://www.nintendo.com/en-gb/Misc-/Nintendo-Classic-Mini-Nintendo-Entertainment-System/Nintendo-Classic-Mini-Nintendo-Entertainment-System-1124287.html).
It was a very limited run, and although cheap, VERY hard to find/order. We managed to get one, though.
Great fun these days for us...
People online (who are always right) indicate that Nintendo is extremely litigious and will go hard after anyone or anything to do with emulators, which pisses off the old-school NES fans because Nintendo also refuses to re-issue their old platforms as demand dictates.
I think my NES "most frequent" award had to be Mike Tyson's Punchout and Contra. Never could beat Mike himself. Contra, on the other hand, I was quite good at and could get through sometimes even without the famous Konami cheat code for extra lives. I really liked the Super Mario trilogy for the old NES as well. Even though SM2 was kinda weird and the princess had such insane athletic ability that I lost sympathy for her getting captured in the other games.
-
I was also fascinated by the weird "minus world" in the original Super Mario. To this day, I'd like an explanation for that.
An old elementary school friend I used to borrow games from and was a major game-nerd now works for Nintendo America as a game designer/software developer. He doesn't know what it was about either.
-
....the famous Konami cheat code....
up up down down left right left right B A Start
since this is the nerd thread :)
-
And the funniest thing about Nintendo right now is that the company president's name is...
Doug Bowser
-
People online (who are always right) indicate that Nintendo is extremely litigious and will go hard after anyone or anything to do with emulators, which pisses off the old-school NES fans because Nintendo also refuses to re-issue their old platforms as demand dictates.
I think my NES "most frequent" award had to be Mike Tyson's Punchout and Contra. Never could beat Mike himself. Contra, on the other hand, I was quite good at and could get through sometimes even without the famous Konami cheat code for extra lives. I really liked the Super Mario trilogy for the old NES as well. Even though SM2 was kinda weird and the princess had such insane athletic ability that I lost sympathy for her getting captured in the other games.
(https://i.imgur.com/37mr2XW.png)
-
This thread brings back a lot of nostalgia for me. I came of age right during the video game revolution, had an Atari 2600 in elementary school, NES by about 6th/7th grade. Commodore 64 in elementary school. I am especially fond of the NES, which really changed how home video games were perceived. Before NES, the games were just basically all about high scores and such. There wasn't really much to do, other than shoot/drive/high score. Take one of the best A2600 games, Pitfall!, you just ran one way or the other, got treasure, and kept going. It had a 20 minute timer. No real music, just a few SFX. I can still remember the very first time I played the OG Super Mario on NES. It had all the little hidden things, so many places to explore, so many levels. We literally played it for days, weeks, and months. Zelda was similar.
A lot of people thought Atari messed up because their consoles sucked, but looking back I realize that it was really the games that sucked. They never got past the "Arcade" model.
We also had a Sega Genesis, which was excellent as well.
-
We had the Atari 2600, and my Atari 400 computer could also be used as a gaming console, it had ROM cartridges as well. But by the time the NES was getting popular I was moving past my gaming console phase and into girls and cars. A college roommate had a Sega Genesis that I only ever played Mortal Kombat on, but even that was pretty sparing. For a brief time in 1993 or 94 I was playing some Doom on my x86 PC system but that was also pretty short-lived.
In short, I'm not much of a gamer, and never have been.
Except for standup Galaga at the arcade. I was a wizard at that.
-
Except for standup Galaga at the arcade. I was a wizard at that.
Local Pizza Hut had Galaga and Pole Position. I spent a lot of time begging my mom for quarters, not nearly enough were forthcoming for my liking.
-
Local Pizza Hut had Galaga and Pole Position. I spent a lot of time begging my mom for quarters, not nearly enough were forthcoming for my liking.
Difference in eras, I'm guessing.
Now when we go to Pizza Port Brewing Co with the kids, I'm happy to give them however many quarters they want so my wife and I can drink our beer in peace :57:
-
When our kids were little, my friend Bald Greg and I would take them to Pinthouse Pizza (brewpub in Austin) and send them over to the little video game section. They had a Ms. Pacman, a Joust, and a Rampage standup video game. We'd tell them the game had already started and they'd think they were playing it, while the demo screen was running. Saved us a bunch of quarters!
-
I quite liked the original Teenage Mutant Ninja Turtles game for NES. It was the only one that ever really made a meaningful distinction between the turtles and their weapons of choice. You could sub out turtles to get through different situations and accomplish different tasks because of the differences. The only "drawback" was the nature of the levels mostly rendered Michaelangelo and Raphael worthless, because their weapons were so short-range, though faster. Usually, you needed Donatello's bo for longer range stuff (but it was slow, that was the tradeoff), or Leonardo's swords which were a good all-around weapon, reasonably fast and with a bit of distance. The levels were interesting, you could go different places, backwards, through doors and into rooms, etc. Sometimes more cerebral than games I was used to, and the graphics were interesting.
The second one they put out seemed to have been more popular, and it was fun in a more mindless, full-on shoot-em-up kind of way, and it introduced the jump kick, which wasn't present in the first game. But all the turtles were basically the same, and the levels were nothing but side-scrolling walk-throughs.
-
Also, Legend of Zelda II
I loved that game. Somehow I never played the first Zelda release.
-
If you remember that era and know anything about the games, there's a YouTube channel called Dorkly that has a lot of funny stuff based on them.
-
Also, Legend of Zelda II
I loved that game. Somehow I never played the first Zelda release.
You just literally listed two of the hardest games ever designed for the NES.
-
Come to think of it, I don't think I ever beat Zelda II.
Now that I think about it, without a friend who had some magazines that gave a lot of tips and mapped some things out, I don't think I would've even gotten close to the end. I remember there came a point in the game where it wasn't obvious any longer what to do or where to go. There was some pretty tricky outside-the-box thinking that had to be done. Well....if you're 8, anyway. Maybe if I'd been a little older I would've figured it out on my own.
-
Oh yeah I played this on our Apple IIc
(https://i.imgur.com/yVRndWs.png)
-
Come to think of it, I don't think I ever beat Zelda II.
Now that I think about it, without a friend who had some magazines that gave a lot of tips and mapped some things out, I don't think I would've even gotten close to the end. I remember there came a point in the game where it wasn't obvious any longer what to do or where to go. There was some pretty tricky outside-the-box thinking that had to be done. Well....if you're 8, anyway. Maybe if I'd been a little older I would've figured it out on my own.
I never beat Zelda 2, and I owned the game. It was freaking hard.
-
(https://i.imgur.com/i7FhF9v.jpeg)
-
(https://i.imgur.com/n6phLTL.jpeg)
-
Oh yeah I played this on our Apple IIc
(https://i.imgur.com/yVRndWs.png)
Whut dis?
This rings a bell, but only as something I've seen somewhere and can't put my finger on.
-
Whut dis?
This rings a bell, but only as something I've seen somewhere and can't put my finger on.
Ultima II on an Apple IIe (or IIc in my case).
-
If I find myself in an arcade today, I look for Ms. Pac-Man.
I grew up on the NES, SNES, then PS2, XBox....idk
I won a tiny trophy for being the best in my daycare in Super Mario Bros. I never owned SMB2, but my neighbor did.
I was a big gamer my whole life, but only when it was too dark to play outside. I'd never pick video games over playing outside/sports.
Since starting Whoa Nellie and having no free time on my hands, I plateaued at an extra Xbox One my brother gave me. The last games I played a lot were Tropico 5, Red Dead 2, and GTA 5.
My favorite games growing up were Super Mario Bros, the NCAA games, and Ken Griffey Jr baseball on the SNES. I edited all the rosters back when that was a challenge to do. The Dodgers were good because they had Piazza's rookie season (1993).
I'd play Rampage and try to do all the levels in Bubble Bobble when I had friends for a sleepover. Those were the days.
Then in high school, we'd endlessly play Goldeneye (4-player) into the wee hours.
Playing my little brother in video games was just an opportunity for him to go nuts, because of the 4-year age difference. We'd play Gauntlet and I'd shoot the food so he couldn't get it, and he'd freak out. I wasn't a very nice brother in that respect. I can still hear the phrase now: "red warrior shot the food!"
-
I definitely beat Zelda II (Link). It really was super hard. Peer group playing and watching others helped a ton. I remember drawing out schematics of worlds and paths... then some nerd showed up with some book or Nintendo Power-type magazine. We wanted to murder him.
Our drawings were pretty good.
-
We used to draw side-scrolling levels for fun. It was a blast.
-
I definitely beat Zelda II (Link). It really was super hard. Peer group playing and watching others helped a ton. I remember drawing out schematics of worlds and paths... then some nerd showed up with some book or Nintendo Power-type magazine. We wanted to murder him.
Our drawings were pretty good.
Showoff.
-
A friend of mine had a game called Jill of the Jungle for PC which was fun, from what I remember of playing it at his house. I wanted to get it, but it required a separate math co-processor chip or it wouldn't run, and my PC didn't have one.
Not long after that, all the stuff such a chip did in that era was folded into any regular processor, and those chips ceased to be a thing.
I didn't even know what that meant, and I still don't, really. I just pictured a chip in my friend's computer, doing algebra homework for some reason, while the game ran.
-
I didn't even know what that meant, and I still don't, really.
They were used for floating-point arithmetic (https://en.wikipedia.org/wiki/Floating-point_unit). As you can imagine, computers only deal in 1s and 0s. The processors of the time didn't have dedicated silicon for advanced arithmetic. Which means they had to essentially use software algorithms to do any complex mathematical operation. And software is much slower than dedicated hardware inside the chip for these operations.
So if you didn't have a math coprocessor (FPU), your computer could still do all those calculations, entirely in software. Which wasn't fast enough for those games. If you had the FPU, then it would simply route all those instructions to the FPU, and you had plenty of performance for games.
Now, as you mention, any modern processor will have that function built in.
For modern computers, the analogy would be the graphics cards (or GPUs) needed for games. Essentially the same thing--complex graphics rendering takes an extraordinary amount of computing power to emulate in software. But if you can have dedicated silicon that does the necessary functions in hardware, you can not only get it done quickly but leave the main processor (CPU) to use its resources on other things.
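If anyone wants to see what the FPU is actually juggling, here's a little Python sketch that cracks a double into the sign/exponent/mantissa fields the hardware works on. The FPU does all the field-shuffling, rounding, and normalizing in dedicated silicon; without one, the CPU had to grind through the equivalent with integer instructions:

import struct

def decompose(x: float):
    # Pack the float into its IEEE-754 double-precision bit pattern,
    # then pull out the sign, exponent, and mantissa fields.
    bits = struct.unpack('>Q', struct.pack('>d', x))[0]
    sign = bits >> 63
    exponent = (bits >> 52) & 0x7FF
    mantissa = bits & ((1 << 52) - 1)
    return sign, exponent, mantissa

print(decompose(6.5))
# (0, 1025, 2814749767106560): i.e. 6.5 = +1.625 * 2**(1025 - 1023)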
---------------
But... Fun story. In 2001, in my first job out of school, I worked for a company that produced programmable logic (FPGA) chips. These were "general" chips full of logic that you could use to map out complex logic and still have it run "in hardware", which was important for MANY functions if you needed the speed of hardware but whatever you were doing didn't lend itself to actually having the dedicated chips designed and fabricated to do it.
Well, one of the things it had at the time was a software-designed (known as a soft core) embedded processor function. Meaning you could emulate a processor in the logic, and use it to run software as opposed to dedicated complex logic. As it was new, the company had an internal design competition to show ways to use the processor. The group I was in... Designed an FPU to go along with it as it didn't have one natively in the design. And we tested software processing of floating-point arithmetic vs our "coprocessor", and our FPU showed a 100-fold reduction in number of clock cycles to perform calculations compared to emulating it in software.
-
I have no idea how you'd do it, but it would be neat to do a back of the napkin calculation on how much computing power your iPhone or Android has (in your pocket you carry everyday) versus the entire computing power of the world of a certain date.
For example, without knowing all the specifics of the era, I can confidently say that my (aging) iPhone 13 has more computing power than existed in the entire world in 1950. And probably also 1955. 1960-65, I would guess that I would still exceed the entire computing power of the world, but I'm not sure. 1970-75, I'd doubt it. 1980? Probably not.
I guess you'd have to estimate how much power a typical computer from back then had and then estimate how many systems they shipped etc. I read somewhere awhile back that we make more data in one day than existed in the entire history of the world until about 2003. This was a few years ago, so we might make more data in 1 hr than the rest of the world until 200x.
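Here's one way to frame the back-of-the-napkin version in Python. Every number below is a rough order-of-magnitude guess on my part (machine speeds and installed counts alike), not a citation, but it shows the shape of the comparison:

# Rough peak throughput of one machine of the era (operations per second),
# times a wild guess at how many were running worldwide.
eras = {
    "1950 (UNIVAC-class, ~dozens of machines)":       2_000     * 50,
    "1960 (IBM 7090-class, ~few thousand machines)":  100_000   * 5_000,
    "1970 (System/360-era, ~tens of thousands)":      1_000_000 * 50_000,
    "1980 (minis + early micros, ~millions)":         1_000_000 * 5_000_000,
}

phone_flops = 1.0e12   # order-of-magnitude guess for a modern phone's GPU

for era, world_ops in eras.items():
    ratio = phone_flops / world_ops
    verdict = "phone wins" if ratio > 1 else "world wins"
    print(f"{era}: world ~{world_ops:.1e} ops/s -> phone is {ratio:,.1f}x ({verdict})")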
-
I have no idea how you'd do it, but it would be neat to do a back of the napkin calculation on how much computing power your iPhone or Android has (in your pocket you carry everyday) versus the entire computing power of the world of a certain date.
For example, without knowing all the specifics of the era, I can confidently say that my (aging) iPhone 13 has more computing power than existed in the entire world in 1950. And probably also 1955. 1960-65, I would guess that I would still exceed the entire computing power of the world, but I'm not sure. 1970-75, I'd doubt it. 1980? Probably not.
I guess you'd have to estimate how much power a typical computer from back then had and then estimate how many systems they shipped etc. I read somewhere awhile back that we make more data in one day than existed in the entire history of the world until about 2003. This was a few years ago, so we might make more data in 1 hr than the rest of the world until 200x.
This is why bwar's employer exists and persists. :)
-
My wife has an iPhone 15 and says she took 3500 photos on our trip (in addition to the thousands she already had). Just the storage for that many high res photos would be impressive.
-
I find that when I take photos that I'm not in, it becomes super uninteresting. Because you can pretty much pull the same photos from any quick search. I still take em, but they never look as good in person.
-
This is why bwar's employer exists and persists. :)
His company is just a front, so that when the aliens arrive, they'll have all our data, strengths and weaknesses, easily available for a much quicker and cleaner conquest and subjugation.
-
I read somewhere awhile back that we make more data in one day than existed in the entire history of the world until about 2003. This was a few years ago, so we might make more data in 1 hr than the rest of the world until 200x.
This is why bwar's employer exists and persists. :)
Yep. I think the factoid we recently used was that more data was created in the last 3 years than in the previous 3,000.
And that the rate of annual data creation will almost triple between 2023 and 2029.
Of course, a tremendous amount of that data is transitory and not stored long term. But the global installed data storage capacity is projected to double over that time frame as well.
The advances in AI/ML increase the ability to extract value from stored data as well, so it should be accretive to the existing projections.
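Just to put a number on "almost triple between 2023 and 2029": that works out to roughly 20% compound growth per year, e.g.:

# If annual data creation roughly triples over the 6 years from 2023 to 2029,
# the implied compound annual growth rate is:
growth_factor = 3.0
years = 2029 - 2023
cagr = growth_factor ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")   # about 20% per year

# Same math for "installed storage capacity doubles over that time frame":
print(f"storage CAGR: {2.0 ** (1 / years) - 1:.1%}")   # about 12% per year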
-
Question for some of you more tech-savvy types. What is the deal lately where they will post clips from some TV show or movie and have the screen inverted or wavy lines or something running through the picture constantly? Is this some sort of AI trickery work-around to keep the copyright violations at a minimum or what exactly is going on?