CFB51 College Football Fan Community
The Power Five => Big Ten => Topic started by: utee94 on April 09, 2025, 11:56:20 AM
-
@betarhoalphadelta (https://www.cfb51.com/index.php?action=profile;u=19) : Split this out from Weird History thread.
-----------------------
Actually from 4/7 but I just saw it on my Facebook feed:
ON THIS DAY | In 1964, IBM unveiled the System/360 line of mainframe computers, its most successful computer system. It was called the "360" because it was meant to address all possible sizes and types of customer with one unified software-compatible architecture.
(https://i.imgur.com/FJARkcY.png)
-
....and it's still smaller than my Google Pixel 7 Pro! Amazing!
srsly.....why are phones so big now?
-
Just curious. What exactly does IBM do these days?
-
srsly.....why are phones so big now?
Because for a lot of people, they are pretty much their primary compute/connectivity device outside of something used purely for work.
I can't stand using my phone (Pixel 5) for much, because the screen isn't big enough. I'd personally love to have a Pixel Pro Fold, to have that level of screen size.
-
Just curious. What exactly does IBM do these days?
AI and software stuff, I think.
-
AI and software stuff, I think.
They're still QUITE a player in the hardware game. Not just AI/software. It's all enterprise/datacenter stuff though, so not something a typical consumer would ever see.
I don't want to get into it more, and assume our other resident EE doesn't want to either, because their business is interrelated to both of ours...
-
They're still QUITE a player in the hardware game. Not just AI/software. It's all enterprise/datacenter stuff though, so not something a typical consumer would ever see.
I don't want to get into it more, and assume our other resident EE doesn't want to either, because their business is interrelated to both of ours...
Data centers and servers?
-
Speaking of history and IBM.... I remember a time before the term PC was common, and the machines people had that would later be called PCs were called "IBM-compatible." You might have an IBM desktop, or you might have some other brand which was IBM-compatible (we had a Vendex). Near as I can tell, it just meant they operated on DOS, so I guess Microsoft was already behind the curtain. I guess Apple was a thing at that time....I don't really know. I don't know anyone who had an Apple Macintosh in the late 80's, if they were around. But it seems like the term "IBM-compatible" would be meant to distinguish those machines from something else.
Fast forward a decade and the term had become PCs, which referred to Windows-based machines, and again, not-Apples. The term PC is odd to me, because it stands for Personal Computer, which Apples also are. And of course, nobody calls them Apples anymore, they're referred to as Macs.
-
IBM compatible was a big deal in the early 80's, best I can tell. My understanding is that IBMs were seen as more business-friendly, whereas other competitors (Apple II and others) were seen more as education or gaming machines. Compaq made the first clone; they had to reverse-engineer the BIOS, and after that the floodgates busted open and many clones were out there.
Microsoft played the long game, setting themselves up for long-term winning with DOS and some pretty tough business practices.
-
Speaking of history and IBM.... I remember a time before the term PC was common, and the machines people had that would later be called PCs were called "IBM-compatible." You might have an IBM desktop, or you might have some other brand which was IBM-compatible (we had a Vendex). Near as I can tell, it just meant they operated on DOS, so I guess Microsoft was already behind the curtain. I guess Apple was a thing at that time....I don't really know. I don't know anyone who had an Apple Macintosh in the late 80's, if they were around. But it seems like the term "IBM-compatible" would be meant to distinguish those machines from something else.
Fast forward a decade and the term had become PCs, which referred to Windows-based machines, and again, not-Apples. The term PC is odd to me, because it stands for Personal Computer, which Apples also are. And of course, nobody calls them Apples anymore, they're referred to as Macs.
The Wiki article on this is pretty solid: https://en.wikipedia.org/wiki/IBM_PC_compatible
There was a lot more to it... Much bigger proliferation of computers, all with proprietary hardware architectures. Proprietary architectures meant proprietary software. Then IBM came out with the PC with the Intel 8088 processor, and everyone copied the architecture so that they could all use the same software. Over that decade all of the other proprietary brands with the exception of Apple basically died off, Microsoft became king of the OS market, and Intel became king of the processor market.
And as the article points out, the more modern term became "Wintel", for a Windows OS computer based on the Intel (or AMD) x86 architecture, because "IBM-compatible" no longer carried any weight.
-
What I recall from my elementary school years--and how I still think about it--is that the "compatible" in IBM-compatible meant that my friends with IBM-Cs could run their games on my IBM-C and vice versa. Whereas my friends with Commodore 64's could not boot their games over at the houses of us kids with IBM-Cs, and vice versa.
That was a source of consternation for me, because I thought the Commodore 64 had way cooler games, and I wanted one, and I was so pissed when we finally got a computer one day and my dad had come home with an IBM-compatible. It was because a guy he worked with was a techie for his time, and he told my dad the Commodore 64 was basically junk, and he recommended something that could run WordPerfect and some other stuff I didn't care about.
I wanted the games, dammit.
But I did learn DOS on that old Vendex, which I was proud of until Windows hid DOS from the minds of the public and eventually did not run on top of DOS at all, so nobody cared about my cool DOS skills anymore.
I still maintain Microsoft propagated Windows so hard to make everybody forget about DOS 6.0, which as far as I could tell was basically a virus Microsoft decided to release under the guise of "operating system."
-
The good news if you've learned DOS is that it makes it a lot easier to learn Linux as you have already lived in the command line world.
And all the coolest geeks run Linux these days.
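The muscle memory mostly transfers, too. A rough sketch of a few everyday equivalents (bash commands, with the old DOS version in the comments):
  ls -l            # DOS: dir
  cd projects      # DOS: cd projects (same idea)
  cp a.txt b.txt   # DOS: copy a.txt b.txt
  rm old.txt       # DOS: del old.txt
  cat notes.txt    # DOS: type notes.txt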
-
I run UNIX variants that are NOT Linux because screw those guys.
IBM-compatible, as bwar pointed out, really meant MS-DOS OS plus x86 architecture.
There were a handful of Apple clones as well back in the early days. Franklin was one of them and my best friend owned one. Apple sued the bejeezus out of them and forced them all out of business.
Oh and IBM is also big into commercial enterprise consulting services. Something like 30% of their revenue comes from that. But bwar is correct in that I can't talk much more about them.
-
The good news if you've learned DOS is that it makes it a lot easier to learn Linux as you have already lived in the command line world.
And all the coolest geeks run Linux these days.
I started loading my old laptops with Linux years ago, but I'm not really a cool geek. The distros of at least the last 15 years (the span in which I've messed with Linux) are very visually similar to the Windows GUI, and everything seems kinda dumbed down and automatically "easy," even for somebody like me who had never been on a Linux OS before. I never know how to do something on Linux, yet I'm almost always right on the first guess for where to find something.
I never got into the command line stuff and don't know any Linux commands. Never use the command line, tbh.
On the ease-of-use thing, that's as opposed to me trying my hand at Macs, which are reputed to be super-easy and marketed as such......"everything just works," Mac users would always tell me. Coming from a lifetime of Windows, I find Macs un-intuitive and I struggle with them, particularly the filing system. In my experience, Linux is the OS that Macs are purported to be.
There used to be an OU fan here on the board named CrimsonGaloot who helped me get going with Linux, and taught me how to load it side-by-side with Windows and dual-boot if I chose. He's not around anymore, and I no longer have his email address, and that's a shame because he was a good guy and good tech support.
-
Anyone who is even vaguely interested in the late 70s/early 80s personal computer revolution should watch a series from AMC called "Halt and Catch Fire." It starred one of my favorite actors, Lee Pace, and it was just really well done IMO. It was strongly nostalgic to me for some obvious reasons.
-
I run UNIX variants that are NOT Linux because screw those guys.
IBM-compatible, as bwar pointed out, really meant MS-DOS OS plus x86 architecture.
There were a handful of Apple clones as well back in the early days. Franklin was one of them and my best friend owned one. Apple sued the bejeezus out of them and forced them all out of business.
Oh and IBM is also big into commercial enterprise consulting services. Something like 30% of their revenue comes from that. But bwar is correct in that I can't talk much more about them.
Isn't Linux open source? Who are we "screw-you"ing and what did they do?
I don't know what x86 architecture means, so that's lost on me.
Apple is nothing if not consistent. Proprietary to a fault, even when it screws the customer.
Now I'm super-interested in what it is about IBM y'all can't talk about. Sounds shady. Possibly dangerous and illegal. Are y'all spies? Am I gonna read in the papers one day about two rogue agents posed as electrical engineers who took down the human trafficking ring posing as business consulting services?
-
Anyone who is even vaguely interested in the late 70s/early 80s personal computer revolution should watch a series from AMC called "Halt and Catch Fire." It starred one of my favorite actors, Lee Pace, and it was just really well done IMO. It was strongly nostalgic to me for some obvious reasons.
I watched it when it was on. Highly enjoyable. Its setting really pre-dates my computer awareness since I believe the show starts set in either the late 70's or very early 80's and I didn't know anything about computers until at least 1987, and even then, I knew nothing technical. Other than I learned some DOS commands, I guess. That's not really technical....that's just having to learn how to boot up my games and stupid word processor to do my homework.
Nevertheless, it was nostalgic for me too, and I couldn't exactly tell you why. Maybe just anything about the 80's. But it did reference a few major things here and there I would've been aware of in the computer world, even as young and clueless as I was.
Also........screw Donna. I can't even remember what she did in s4, but the way she treated Bos was crap, and I haven't forgiven her for it.
-
Linux guys just think they're the rock stars of the UNIX world. They sniff their own farts. Screw 'em!
x86 is a term referencing Intel's CPU architecture. It started with the 16-bit 8086 and then moved on to the 80186, 286, 386, and so on. Hence the generic term x86.
Apple do what Apple do, same as it ever was.
I'd tell you but then I'd have to ki.... mmmm.... nevermind.
-
I started loading my old laptops with Linux years ago, but I'm not really a cool geek. The distros of at least the last 15 years (the span in which I've messed with Linux) are very visually similar to the Windows GUI, and everything seems kinda dumbed down and automatically "easy," even for somebody like me who had never been on a Linux OS before. I never know how to do something on Linux, yet I'm almost always right on the first guess for where to find something.
I never got into the command line stuff and don't know any Linux commands. Never use the command line, tbh.
Yeah, Linux is a LOT easier to use than it was 20 years ago. Honestly, if you want someone to have a PC that's largely just a dumb terminal / web browser, I'd rather set up a Linux system than Windows, because they don't know enough to screw it up, and there's not a large group out there trying to write Linux viruses. It's like a Chromebook, but completely open and not locked into the Google world.
I recall that when I moved from SoCal to Atlanta in 2005, I embarked on a project to get deeper into Linux as a learning experience. So I repurposed a desktop tower PC that I had, using a TV capture card and a graphics card, and built my own DVR based upon the MythTV software that ran on top of Linux at the time.
It was all pretty cool and not that hard--except that Linux didn't have a built-in driver for the ATi graphics card, so I had to compile the driver myself from source. Made things harder than they needed to be, but it worked!
The cool thing about having an open-source DVR is that all the features a cable/satellite/streaming provider could never build into their DVR software without pissing off the content owners are completely available. Once a recording was complete, MythTV would go in, analyze it to find the commercials, and just cut them right out of the file. So you didn't even have to skip or fast forward through commercials. They were just--poof!--gone.
Now Linux is pretty simple; you can do most everything using the GUI. But if you know how to get under the hood with the command line, you get WAY more control than that.
-
macOS has been UNIX-based since the first release of Mac OS X back in 2001 (and formally UNIX-certified since 2007). They do allow access to the CLI, but I don't think they allow as much control as a normal UNIX OS would have. But that's just what I've heard, I've had no experience doing any CLI operations on an Apple product since the days of Apple ProDOS in 1982 or 83 maybe.
-
What I recall from my elementary school years--and how I still think about it--is that the "compatible" in IBM-compatible meant that my friends with IBM-Cs could run their games on my IBM-C and vice versa. Whereas my friends with Commodore 64's could not boot their games over at the houses of us kids with IBM-Cs, and vice versa.
That was a source of consternation for me, because I thought the Commodore 64 had way cooler games, and I wanted one, and I was so pissed when we finally got a computer one day and my dad had come home with an IBM-compatible. It was because a guy he worked with was a techie for his time, and he told my dad the Commodore 64 was basically junk, and he recommended something that could run WordPerfect and some other stuff I didn't care about.
I wanted the games, dammit.
But I did learn DOS on that old Vendex, which I was proud of until Windows hid DOS from the minds of the public and eventually did not run on top of DOS at all, so nobody cared about my cool DOS skills anymore.
I still maintain Microsoft propagated Windows so hard to make everybody forget about DOS 6.0, which as far as I could tell was basically a virus Microsoft decided to release under the guise of "operating system."
I had a C64, in fact I still have it. I got it about 1985 or ‘86. Great little computer. Load *.*, 8,1
DOS was cool way back when. Still freaks some younger ppl out when I go to the command line for something vague.
Used to be a fun one: net send in the command line, then their username, then the message. We did one one time that said all data will be deleted, press Enter. We could hear the cussing across the room.
-
I watched and enjoyed the 1st and 2nd season of Halt. It kinda went off the rails after that.
-
I didn't really like the time jump, not sure what season that was. But I got used to it, and still enjoyed it up to the end.
-
I had a C64, in fact I still have it. I got it about 1985 or ‘86. Great little computer. Load *.*, 8,1
DOS was cool way back when. Still freaks some younger ppl out when I go to the command line for something vague.
Used to be a fun one: net send in the command line, then their username, then the message. We did one one time that said all data will be deleted, press Enter. We could hear the cussing across the room.
Load *.*, 8, 1 lolz, I remember that. Still don't know what it means.
Me and my cousin, circa about 8th grade or so, erased the hard drive from DOS on my uncle's computer because....well....I think we just wanted to see if it would work. It did. My uncle didn't have the heart to do anything to me, but I heard my cousin got Vietnam-vet-cussed after I left. My dad thought it was hilarious, though he warned me not to do it to our computer or he'd end me. Speaking of history.....we both probably almost were history on account of that.
-
Linux guys just think they're the rock stars of the UNIX world. They sniff their own farts. Screw 'em!
I thought Linux was a separate language from UNIX.
-
I thought Linux was a separate language from UNIX.
Linux is a variant of UNIX. It is open-sourced and not proprietary like Solaris (Sun Micro), or AIX (IBM), or HP-UX (HP), or others.
-
Yeah, Linux is a LOT easier to use than it was 20 years ago. Honestly, if you want someone to have a PC that's largely just a dumb terminal / web browser, I'd rather set up a Linux system than Windows, because they don't know enough to screw it up, and there's not a large group out there trying to write Linux viruses. It's like a Chromebook, but completely open and not locked into the Google world.
Now Linux is pretty simple; you can do most everything using the GUI. But if you know how to get under the hood with the command line, you get WAY more control than that.
Yeah, and it's real lightweight too, which is why I stick it on my old PC's once the hardware is obsolete. I've had trouble with Linux natively recognizing the touchpad on Dell laptops, but it's gotten easier and easier over the years to load the driver. I want to say the last time I installed it, it recognized the problem and offered to get the driver for me. It's been a couple years, I might be misremembering. I like the Mint distro. Ubuntu is probably the "flagship"--if Linux has such a thing--and I hear it can do more stuff, but I think Mint is the easiest for somebody coming from the Windows world, so it's mostly what I've used. It also has native apps that I really enjoy. Rhythmbox, for example, is a fantastic music manager for my tastes. It's everything iTunes used to be before Apple lost its damn mind in 2011 starting with iTunes version 11. I despise iTunes now as a music manager, but Rhythmbox on Linux? Wonderful.
I wouldn't mind learning Linux, I'm just not motivated to do it. There's no work benefit for me, and I've never come across something in personal use that made me wish I knew how to utilize the command line. Maybe if I knew more about what's possible I'd be more interested. I pretty much stay away from it unless I'm trying to fix some little issue and I've looked up how to do something.
I have to use the command prompt in Windows sometimes for Python stuff, for work and when I was in school. But I mostly don't use that either.
When I gather the money, I want to build a new desktop with 4 hard drive bays. I don't like dual-boot, partitioned-drive setups because I've found it causes some glitches on the Windows side. I'm gonna install Windows on one drive, Linux on another, and my files will be located on a third so I don't have to reload everything every few years when a Linux distro stops being supported and I have to install a new version. I'll leave the 4th blank, but eventually I'd aim to make myself a Hackintosh. For as much as I dislike Macs, they do have music software I like that is simply not available on any other platform.
-
Apropos...
If you're bad with computers, it doesn't mean you're just not good with tech. You're probably just stupid.
https://scitechdaily.com/new-study-a-lack-of-intelligence-not-training-may-be-why-people-struggle-with-computers/
-
I never know how to do something on Linux, yet I'm almost always right on the first guess for where to find something.
What does Lucy's brother in Peanuts have to do with being a cool geek??? So whimsical, sheesh
-
Apropos...
If you're bad with computers, it doesn't mean you're just not good with tech. You're probably just stupid.
https://scitechdaily.com/new-study-a-lack-of-intelligence-not-training-may-be-why-people-struggle-with-computers/
I want you to know I take that personally and I am not amused :96:
-
What does Lucy's brother in Peanuts have to do with being a cool geek??? So whimsical, sheesh
I think you mean Linus. I like him too. He has a security blanket, my folks tell me I did something similar as a toddler. That makes me a cool geek. That's the connection.
-
I wouldn't mind learning Linux, I'm just not motivated to do it. There's no work benefit for me, and I've never come across something in personal use that made me wish I knew how to utilize the command line. Maybe if I knew more about what's possible I'd be more interested. I pretty much stay away from it unless I'm trying to fix some little issue and I've looked up how to do something.
I have to use the command prompt in Windows sometimes for Python stuff, for work and when I was in school. But I mostly don't use that either.
For me it's mainly just that it's free, I know how to use it, and some of the "odd" use cases I have can be done easily. I keep one mini PC that is essentially my home "server". That was the thing I was complaining about a few weeks ago because the hardware had died and I needed to buy a new one, and ran into some weird technical issues trying to load Linux on it because of an old corrupted USB stick.
For example, I was playing around for a while with something called RaspberryPints. It was a Raspberry Pi based web server that would allow you to broadcast your beer tap list to a laptop, tablet, or phone. The Raspberry Pi was... Unstable. It would get corrupted from time to time, requiring a complete reinstall. But because I had a Linux server sitting there connected to my router, I was able to simply install the apache web server and the RaspberryPints web content onto it, and suddenly I had a much more stable web server for my tap list.
Could I do that in Windows? Probably. But the sorts of projects that people create for this sort of thing are usually Linux geeks like me, so being native to Linux just makes life easier for me.
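Just to give a flavor of how little work it was, on a Debian-type distro the Apache part boils down to something like this (the raspberrypints folder name here is just a placeholder, and the RaspberryPints software has its own setup steps on top of it):
  sudo apt install apache2                    # install the Apache web server
  sudo cp -r raspberrypints/ /var/www/html/   # drop the web content into Apache's default docroot
  sudo systemctl enable --now apache2         # start it now and on every reboot
After that, any phone or tablet on the home network can just point a browser at the server's IP.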
When I gather the money, I want to build a new desktop with 4 hard drive bays. I don't like dual-boot, partitioned-drive setups because I've found it causes some glitches on the Windows side. I'm gonna install Windows on one drive, Linux on another, and my files will be located on a third so I don't have to reload everything every few years when a Linux distro stops being supported and I have to install a new version. I'll leave the 4th blank, but eventually I'd aim to make myself a Hackintosh. For as much as I dislike Macs, they do have music software I like that is simply not available on any other platform.
This part would be a good use case for a NAS as a backup, instead.
And now that I'm working through this miniPC build, I'm going through and finally cleaning up a bunch of disparate personal file sources (partly pictures, but other stuff too). I want to get it fully organized for backup purposes.
That miniPC will have local storage, will have a USB HDD dock, and then I also have a NAS. With Linux, it's trivially easy to set up a cron job (regularly scheduled action) that runs a command-line rsync command to make sure that all copies of these personal folders remain synchronized, so that as I add new files or pictures, I won't have to manually maintain both the local and the backup copies. And then I'm looking at using cloud storage [final safety location for my personal files in case my house burns down lol], and many of the cloud backup storage providers have Linux clients for synchronizing with their service too.
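For anyone curious, the crontab entry ends up being a one-liner, something like this (the paths are made up, and --delete means the backup mirrors deletions too, so use it with care):
  # run every night at 2:30 AM: mirror the personal folder to the NAS
  30 2 * * * rsync -a --delete /home/me/personal/ /mnt/nas/personal/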
If your files are important, I'd rather they be stored on your primary storage / desktop (one copy), on external storage or a NAS (second copy), and in an offsite location (third copy). It's the 3-2-1 rule for backup.
-
I want you to know I take that personally and I am not amused :96:
Believe me, if you're dual-booting PCs with Windows and Linux and talking about setting up a Hackintosh...
...you're good with computers.
The fact that you're not a command-line expert doesn't change that.
-
This part would be a good use case for a NAS as a backup, instead.
And now that I'm working through this miniPC build, I'm going through and finally cleaning up a bunch of disparate personal file sources (partly pictures, but other stuff too). I want to get it fully organized for backup purposes.
That miniPC will have local storage, will have a USB HDD dock, and then I also have a NAS. With Linux, it's trivially easy to set up a cron job (regularly scheduled action) that runs a command-line rsync command to make sure that all copies of these personal folders remain synchronized, so that as I add new files or pictures, I won't have to manually maintain both the local and the backup copies. And then I'm looking at using cloud storage [final safety location for my personal files in case my house burns down lol], and many of the cloud backup storage providers have Linux clients for synchronizing with their service too.
If your files are important, I'd rather they be stored on your primary storage / desktop (one copy), on external storage or a NAS (second copy), and in an offsite location (third copy). It's the 3-2-1 rule for backup.
My plan would not entail the 3rd hard drive being a backup, it'd be the primary file storage....files to be accessed from either the Windows or Linux side, so I don't have one set of files in one environment and another set of files in the other environment, and so I don't have to duplicate all the files on both sides. That way, I boot into whichever OS I want to use, but all my files are there for me. Seems like I should be able to get away with smaller drives for the two I just want to put the OS on.....I don't plan on filling them up with files.
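From the little reading I've done, if I format that shared drive as NTFS (so Windows can use it natively), the Linux side can supposedly auto-mount it with one line in /etc/fstab, something like this (the label and mount point are just made-up examples):
  # shared data drive, readable/writable from Linux via the ntfs-3g driver
  LABEL=DATA   /mnt/data   ntfs-3g   defaults   0   0
Then whichever OS I boot into sees the exact same files.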
A NAS would still seem to be a good backup. I have a Synology NAS with two drives, but I'd need to replace them with something much bigger. They're old drives we found in my wife's closet from who-knows how long ago, and they're only like 500 GB, I think. They're set up in RAID 1, or whatever array type mirrors the drives in case of failure. I only keep my music on them, because I don't think all our videos, photos, etc. would fit. And the Synology software plays nice with the Plex app on the living room Firestick, so I can access my music collection in the living room through the nice studio speakers I have hooked up to the TV.
I don't know about paying for off-site backup. I see the wisdom in it, but it depends on how much it is, and also I don't trust other companies with my data. Google, Facebook, and Microsoft already have more on me than I'm comfortable with. I have an external backup drive that I update every couple months, and it stays in a fireproof/waterproof safe. I've not yet felt the urge to do off-site cloud storage.
Have you ever messed with Wine in Linux? I never have. I used to hear it was glitchy, but I've also heard that in the last few years it's gotten better. Depending on the capability and reliability there, maybe I wouldn't even need a Windows drive.
Also, maybe we should section off these posts to a new tech-nerd thread. I feel like I'm derailing the history thread pretty severely at this point.
-
I split it out.
For the NAS, if it's that old, with drives that old, you might want to replace the NAS, not just the drives.
I've never messed around with WINE. I mean, I like wine, but I've never really tried WINE. For me, I've been happy enough with what I can do in Linux that I haven't cared to do anything with Windows.
-
The NAS is pretty new. Just the drives are old.
If I could use the MS Office suite in Linux, I might not care for Windows at all. I assume the cloud version of 365 would be fine on Linux, but I like the apps on my PC. Can't remember where, but I believe I used the cloud versions before and it was different than the local-based programs. I guess I'd also need to look into how my IDE's work (if they work) on Linux.
Other than that, honestly, I basically just goof off on the interwebz. Can't really think of much I do that has to be on Windows.
-
The NAS is pretty new. Just the drives are old.
If I could use the MS Office suite in Linux, I might not care for Windows at all. I assume the cloud version of 365 would be fine on Linux, but I like the apps on my PC. Can't remember where, but I believe I used the cloud versions before and it was different than the local-based programs. I guess I'd also need to look into how my IDE's work (if they work) on Linux.
Other than that, honestly, I basically just goof off on the interwebz. Can't really think of much I do that has to be on Windows.
Is what you're doing with MS Office a work-related thing? IMHO if not, most of what you want can probably be handled in LibreOffice.
-
Is what you're doing with MS Office a work-related thing? IMHO if not, most of what you want can probably be handled in LibreOffice.
Not usually, as work has to be done on the work laptop.
However, I've used Libre and Apache OpenOffice over the years, and while they're way better than nothing, I have run into functionality problems in both the Word and Excel equivalents. I only briefly messed with the slide presentation software, but it seemed painfully behind PowerPoint, and I have done many personal slide presentations, and plan to do more.
I will admit, OpenOffice is highly impressive for something that's free, and I've always had it on my Linux machines.
-
Yeah, I don't know that I'd base a business around Libre/Open as they're behind MS on functionality, but then again I'm not trying to... The question for me is "are they close enough for what I need?"
-
I think you mean Linus. I like him too. He has a security blanket, my folks tell me I did something similar as a toddler. That makes me a cool geek. That's the connection.
Tiger rag?
-
Load *.*, 8, 1 lolz, I remember that. Still don't know what it means.
Me and my cousin, circa about 8th grade or so, erased the hard drive from DOS on my uncle's computer because....well....I think we just wanted to see if it would work. It did. My uncle didn't have the heart to do anything to me, but I heard my cousin got Vietnam-vet-cussed after I left. My dad thought it was hilarious, though he warned me not to do it to our computer or he'd end me. Speaking of history.....we both probably almost were history on account of that.
Best I recall it means to load the equivalent of the .exe file. 8 means disk drive, and 1 means which one, because you could have multiple drives. I never knew what the 2nd and other drives would be good for, and damn were they slow and noisy.
-
It’s funny because people are still out there running old C64’s and creating new hardware. You can buy a USB stick that will adapt to it, and there is (or was) a hard drive option. A few enterprising spirits have even surfed the web in some kind of text-only slow mode.
I never saw a Mac at any of my friends’ houses except one guy in HS who was more of a casual friend. It was so much more capable than the PCs of the day it blew my mind, and it was already 4-5 years old.
Honestly, the early 1980’s was such a great time to be growing up. Computers were just starting to become common, and there was so much choice in the early days. Apple (II or Mac), C-64 and then later the Amiga, PC compatible, Atari 400, 800, etc.
Bulletin boards, disk drives, dial up modems.
-
I think you mean Linus. I like him too. He has a security blanket, my folks tell me I did something similar as a toddler. That makes me a cool geek. That's the connection.
MDT /s ~???
-
It’s funny because people are still out there running old C64’s and creating new hardware. You can buy a USB stick that will adapt to it, and there is (or was) a hard drive option. A few enterprising spirits have even surfed the web in some kind of text-only slow mode.
I never saw a Mac at any of my friends’ houses except one guy in HS who was more of a casual friend. It was so much more capable than the PCs of the day it blew my mind, and it was already 4-5 years old.
Honestly, the early 1980’s was such a great time to be growing up. Computers were just starting to become common, and there was so much choice in the early days. Apple (II or Mac), C-64 and then later the Amiga, PC compatible, Atari 400, 800, etc.
Bulletin boards, disk drives, dial up modems.
Yup. There was a sense of discovery with all the new technology, rather than just the feeling of utility that exists now.
-
Early 80's was great in High School and college
didn't touch a computer in high school, and it wasn't until my 2nd or 3rd year in college that we were forced to go to the library to use a computer to register for classes
that was it
-
We had telephone registration for university classes, which started my freshman year and was considered very high tech.
I don't know when they switched to online registration. I think I still used the phone registration when I went back for grad school in 2001-2003, but I honestly don't remember.
-
Yup. There was a sense of discovery with all the new technology, rather than just the feeling of utility that exists now.
I was always disappointed I never got to play around on any Amiga or Atari computers. It is said that the Amiga was a very good computer for its time, and it had capabilities that were years ahead of its competitors with regards to video editing and graphics. There was one expensive add-on called the "Video Toaster" (what a horrible name) that was used in several TV shows and movies to do the SFX well into the 1990's. Commodore/Amiga could have very well given WinTel a run for their money in the early to mid-90's if they hadn't been so poorly managed.
Speaking of poorly managed, it's such a shame that Atari didn't survive the way Apple did. They had a hit with the 2600 and their arcade games and then released flop after flop after that. I only played the 5200 once or twice, the controller was horrendous. It was like they tried to make the most horrible controller ever in the history of controllers. Then the 7800 was better but still not very good.
Atari was kind of a victim of corporate neglect, having been bought and sold so many times it could never keep any momentum. Once their arcade cash cow dried up that was it.
-
We had telephone registration for university classes, which started my freshman year and was considered very high tech.
I don't know when they switched to online registration. I think I still used the phone registration when I went back for grad school in 2001-2003, but I honestly don't remember.
A&M had a similar set-up, it was called the Bonfire System. Ask your wife about it, I bet she'll remember it vividly. You had to go through a phone menu (press 1 for this or that), and there was a companion book that listed all the courses and the schedule. It worked pretty well. You could connect to the university system via modem and see your schedule and grades and all that, and it would update. It was frustrating because you couldn't see which classes had space, so you had to keep attempting to get into a class, and if the time/day was full you had to choose another, which would then mess up the rest of your schedule. It got easier as you got seniority because you got to register earlier.
I can still hear the lady's voice in the intro : "Welcome to the Texas A&M University Student Registration System" or something similar.
I think they changed it my very last or 2nd to last semester (Spring/Fall 2000) it was all online. Worked much better.
-
Commodore/Amiga was competing more in Apple's space than in the PC world. Maybe there was room for two players in that space but I'm not so sure.
One of my theater-kid friends had an Amiga and we used it to do all sorts of video titling and special effects. In our English Lit and Theater classes, any group project that came up, we ended up making a movie out of. So when we read 1984 and Brave New World, about 8 of us got together and made the movie we called "Utopia." When we studied Shakespeare, we did a spoof about Hamlet but the main character was Ronald Reagan. I of course played The Gipper himself. There were a couple of others, my dad actually found the VHS tapes and had them transferred to digital and stored in the Cloud.
We used my dad's VHS camcorder and shot on location all over Austin. Then I used our two VCRs and did the video editing, and I borrowed my church's 8-channel mixer and dubbed all of the background music and voiceovers onto the main copy. Finally we used the Amiga to add video titles and transition effects. It was all pretty professional, it helped that both of my parents were Radio-Television-Film majors at UT and had taught me how to do all this stuff.
And these things were like 30 minutes long, we'd take one class period to see ALL the other groups' presentations, but then our group always got its own entire class period. And our teachers would show the videos to all of their other classes as well. It was really a lot of fun and mixed my theater/music/tech geek aptitudes well.
Anyway... memories...
-
Commodore/Amiga was competing more in Apple's space than in the PC world. Maybe there was room for two players in that space but I'm not so sure.
One of my theater-kid friends had an Amiga and we used it to do all sorts of video titling and special effects. In our English Lit and Theater classes, any group project that came up, we ended up making a movie out of. So when we read 1984 and Brave New World, about 8 of us got together and made the movie we called "Utopia." When we studied Shakespeare, we did a spoof about Hamlet but the main character was Ronald Reagan. I of course played The Gipper himself. There were a couple of others, my dad actually found the VHS tapes and had them transferred to digital and stored in the Cloud.
We used my dad's VHS camcorder and shot on location all over Austin. Then I used our two VCRs and did the video editing, and I borrowed my church's 8-channel mixer and dubbed all of the background music and voiceovers onto the main copy. Finally we used the Amiga to add video titles and transition effects. It was all pretty professional, it helped that both of my parents were Radio-Television-Film majors at UT and had taught me how to do all this stuff.
And these things were like 30 minutes long, we'd take one class period to see ALL the other groups' presentations, but then our group always got its own entire class period. And our teachers would show the videos to all of their other classes as well. It was really a lot of fun and mixed my theater/music/tech geek aptitudes well.
Anyway... memories...
A true Nerd's nerd.
-
Best I recall it means to load the equivalent of the .exe file. 8 means disk drive, and 1 means which one, because you could have multiple drives. I never knew what the 2nd and other drives would be good for, and damn were they slow and noisy.
My understanding was that the Commodore 64 basically didn't have a hard drive. The disk drive was kind of it. Obviously a motherboard/processor had to be somewhere. I didn't know you could do multiple drives, or how that would work. My friends only had the one floppy disk drive. If you could do any kind of work or hobby that you could save on those machines, we didn't know how. We just loaded games from a floppy, and that was basically it.
The guy I mentioned who worked with my dad is who told me that, and I have no idea if he really knew what he was talking about. He would've been considered a PC guru for his time, I know that much. In an age where most people didn't have computers, and those who did could only do the most basic of tasks with them, he was doing business, tax documents, all kinds of stuff on his home PC. Since I blamed him for us getting an IBM-compatible, I asked him what the hell. He said the Commodore was kind of a piece of trash that didn't even have a hard drive. Whether he was right or he was just an IBM shill, I couldn't say. I was like 8.
-
A true Nerd's nerd.
There is no denying that my nerd credentials are quite strong.
Electrical engineer, computer programmer, high school band and drama and choir and A/V, regularly played Dungeons & Dragons and the Ultima series of computer games, have read every Isaac Asimov, Piers Anthony, and Tolkien book ever published (plus a few hundred more), and dressed as Dr. Who for a 6th grade Halloween party.
I never wore glasses, at least not until I hit 50. That's about the only thing I'm missing.
-
I never saw a Mac at any of my friends’ houses except one guy in HS who was more of a casual friend. It was so much more capable than the PCs of the day it blew my mind, and it was already 4-5 years old.
Honestly, the early 1980’s was such a great time to be growing up. Computers were just starting to become common, and there was so much choice in the early days. Apple (II or Mac), C-64 and then later the Amiga, PC compatible, Atari 400, 800, etc.
Bulletin boards, disk drives, dial up modems.
I never saw a Mac growing up either. My intro to Macs was 1993, 8th grade; the nerd-G&T English class I was in had them. No idea how old or new they were. We did some of our work on them, but for the life of me I can't really remember how we used them or what we did with them. iirc--and I might not be--Windows was a thing by then, and we might have even had it, but I was still booting into DOS and loading Windows from there. And I'd already been dealing with DOS for a few years by then, so a visual interface was all new to me. I recall thinking the Mac interface was way more visual than what I was used to, but pretty cool. Now that I think about it, I think we did some kind of presentations with those things. Like book reports and stuff. Maybe something akin to a forerunner of PowerPoint. One of the kids figured out how to do animations and taught us to make our images move around, and I felt like Marty McFly and I'd just jumped into the future.
I was not what you'd call tech-savvy.....like, I wasn't doing stuff like utee talks about, but I still recall the same sense of wonder you guys talk about. I'm a little younger than utee, maybe you too, but even a few years after y'all, that same ethos was still permeating the kids who had access to computers. I remember my dad telling me that one day nearly everybody would have a computer. It was hard to believe him, and I certainly wouldn't have imagined laptops and tablets, and people doing work or hobbies sitting on their couch.
The summer after my 8th grade year I recall AT&T had a commercial that ran, advertising telecommunication. It featured a voice saying something like "Have you ever attended a meeting......from your vacation bungalow?" and it showed a guy in beach clothes on the deck of what was supposed to be a beach-front vacation property with the beach in the background, talking to severe-looking people in business attire on a screen. Then the voice said "You will." I remember thinking that was awesome, and I thought we were supposed to be able to do that right now. Of course, the internet, such as it was, was dial-up, and most everything I knew about was bulletin boards (I got in major trouble on those) and absolutely nobody was doing anything of the sort. After a while I thought AT&T had lied to me, was full of crap, and I basically forgot about the idea for years. I just noticed during the pandemic when Zoom became so popular, that we've had Skype and Facetime and stuff for years, and I never really noticed. The future came and I'd failed to really notice it.
-
There is no denying that my nerd credentials are quite strong.
Electrical engineer, computer programmer, high school band and drama and choir and A/V, regularly played Dungeons & Dragons and the Ultima series of computer games, have read every Isaac Asimov, Piers Anthony, and Tolkien book ever published (plus a few hundred more), and dressed as Dr. Who for a 6th grade Halloween party.
I never wore glasses, at least not until I hit 50. That's about the only thing I'm missing.
I hesitate to call foul on an honest broker such as yourself, but this time I am really tempted. Piers Anthony has to have written over 100 books, and I am a bit skeptical.
I've read a few of his, my sister liked the Xanth novels when she was younger, and I have a friend who was way into him when we were growing up. For the most part I missed out on him. The books I did read, my impression of him was amazing premises.....nobody came up with cool ideas like him.....but I didn't think much of his writing or plots. That's just me though, obviously tons of people love his works.
-
I hesitate to call foul on an honest broker such as yourself, but this time I am really tempted. Piers Anthony has to have written over 100 books, and I am a bit skeptical.
I've read a few of his, my sister liked the Xanth novels when she was younger, and I have a friend who was way into him when we were growing up. For the most part I missed out on him. The books I did read, my impression of him was amazing premises.....nobody came up with cool ideas like him.....but I didn't think much of his writing or plots. That's just me though, obviously tons of people love his works.
That's fair.
I'm about 99% sure I read everything he published before May 1994. Since then, probably not much. So that included about half of the many many Xanth novels, all of the Incarnations of Immortality, all of the Apprentice Adept, most of the Bio of a Space Tyrant, and plenty of rando additional stuff. But I mean, I've read over a thousand books and possibly double that, so it's not inconceivable I would have read all of his. My own library is about 400 books and that's just a fraction of what I've read in my lifetime.
Anyway I thought PA was an entertaining writer, never really had any issues with his stuff.
-
Ever read any of Terry Pratchett's Discworld novels?
-
My understanding was that the Commodore 64 basically didn't have a hard drive. The disk drive was kind of it. Obviously a motherboard/processor had to be somewhere. I didn't know you could do multiple drives, or how that would work. My friends only had the one floppy disk drive. If you could do any kind of work or hobby that you could save on those machines, we didn't know how. We just loaded games from a floppy, and that was basically it.
The guy I mentioned who worked with my dad is who told me that, and I have no idea if he really knew what he was talking about. He would've been considered a PC guru for his time, I know that much. In an age where most people didn't have computers, and those who did could only do the most basic of tasks with them, he was doing business, tax documents, all kinds of stuff on his home PC. Since I blamed him for us getting an IBM-compatible, I asked him what the hell. He said the Commodore was kind of a piece of trash that didn't even have a hard drive. Whether he was right or he was just an IBM shill, I couldn't say. I was like 8.
Our C64 didn't even have a hard drive. It had a tape drive. It was TONS of fun to wait for the computer to read the cassette tape with the program, sequentially, for about 20 minutes before it got to the part of the tape that had your program on it, before you could do *anything*.
The C64 did also have a place to load a cartridge into the back, and there were games that were cartridge-based. We had Solar Fox (https://en.wikipedia.org/wiki/Solar_Fox) and Frogger. Those were better because they loaded immediately.
-
My understanding was that the Commodore 64 basically didn't have a hard drive. The disk drive was kind of it. Obviously a motherboard/processor had to be somewhere. I didn't know you could do multiple drives, or how that would work. My friends only had the one floppy disk drive. If you could do any kind of work or hobby that you could save on those machines, we didn't know how. We just loaded games from a floppy, and that was basically it.
The guy I mentioned who worked with my dad is who told me that, and I have no idea if he really knew what he was talking about. He would've been considered a PC guru for his time, I know that much. In an age where most people didn't have computers, and those who did could only do the most basic of tasks with them, he was doing business, tax documents, all kinds of stuff on his home PC. Since I blamed him for us getting an IBM-compatible, I asked him what the hell. He said the Commodore was kind of a piece of trash that didn't even have a hard drive. Whether he was right or he was just an IBM shill, I couldn't say. I was like 8.
Well, no, the C64 did not have any kind of a hard drive, as most other computers did not in that era. Even the IBM-compatible computers of the early 80's did not have a hard drive, as far as I can remember. I don't even think any of the computers in my school had hard drives, and I graduated from HS in '94. I remember the first time I even heard about a hard drive, it was probably in the early 90's and one of our teachers was telling us about it. She said it had a 10 (!) megabyte hard drive, and we all ohh'd and ahhh'd. 10 megabytes? What on earth would you ever need that much storage for? Remember, back in those days a floppy was 360 KB, or the still-new "hard floppy" was 1.44 MB. Remember, there were no digital videos, photos, or music in the 80's and 90's. Most games were on the order of 20-30KB. I read recently that the original Super Mario Brothers game for NES was 32KB.
You could save to the C64 disk if you were working on something, and I think the disk drives (1541) could be daisy-chained together. The disk drives actually had their own memory and processor, which is what made them so expensive. Basically a computer feeding a computer. They were damn loud too.
There was a fast-load cartridge you could buy (Epyx) that made the disk drive much faster. Supposedly, some design decision made way back to keep the C64 compatible with VIC-20 software doomed the speed of the disk drive from then on.
-
Our C64 didn't even have a hard drive. It had a tape drive. It was TONS of fun to wait for the computer to read the cassette tape with the program, sequentially, for about 20 minutes before it got to the part of the tape that had your program on it, before you could do *anything*.
The C64 did also have a place to load a cartridge into the back, and there were games that were cartridge-based. We had Solar Fox (https://en.wikipedia.org/wiki/Solar_Fox) and Frogger. Those were better because they loaded immediately.
For some reason we never owned any cartridge games. But friends had them, and yes Frogger was great.
-
We got our first PC in [I think] 1985, and there was a 10 MB hard drive in that one.
It was actually IBM, not a clone. The IBM PC XT (https://en.wikipedia.org/wiki/IBM_Personal_Computer_XT).
-
Ever read any of Terry Pratchett's Discworld novels?
No. I've heard of them but never read them.
-
We got our first PC in [I think] 1985, and there was a 10 MB hard drive in that one.
It was actually IBM, not a clone. The IBM PC XT (https://en.wikipedia.org/wiki/IBM_Personal_Computer_XT).
Rich PPL!
All joking aside, the C64 was a $200 machine back then. The IBM was probably well over $1,000.
-
It's so funny to be from the era I'm from, because we started with DOS-type OSes, went to early GUIs like Win 3.1 and Mac, then to the early smartphone era (Blackberry etc), then to the smartphone and tablet era, and now to whatever is next. A lot of youngsters that come to work for my company (non-degreed of course) have no PC skills at all. It blew my mind that I knew so much more about PCs and Wintel systems than they did. Simple tasks like setting up printers, getting the internet to work, all kinds of misc settings and configurations. Go find somebody under 30 and pull up the command prompt and show them how to use it. They have no idea. Now, you get on the phone or tablet and they know everything, but most companies don't run on phones and tablets.
-
We got our first PC in [I think] 1985, and there was a 10 MB hard drive in that one.
It was actually IBM, not a clone. The IBM PC XT (https://en.wikipedia.org/wiki/IBM_Personal_Computer_XT).
We got our IBM-compatible, I'd say, probably around '88 or '89. But maybe as late as '90. I don't have a clear association in my memory with what grade I was in, so I can't remember for sure. As many times as I saw it boot up I should be able to remember the specs, but I'm just guessing when I say it said 256k memory, but I think that's right. I don't know how much hard drive space. We replaced that around '92 or '93 with a machine that had the slick, new Windows 3.1 pre-installed. That one had 4 MB of RAM and I was hyper-impressed and couldn't imagine what could possibly use that much memory. The hard drive was 200 MB, but my brother-in-law, who is quite a tech-nerd himself, did something he called "stacking" the hard drive, and increased its capacity to 400 MB. To this day I have no idea what that is or what he did. I'm not aware of any procedure to be done on a modern hard drive that can double its storage capacity. I only know he wasn't making it up. Without changing the hard drive, the specs it listed did double.
Unless the bastard knew some kind of way to make it say something different than what it actually was. Which is not out of the question. That idgit didn't even finish high school and has never read an interesting book in his life, but give him a technical manual and he eats that crap up. He learned multiple programming languages and really learned his way around hardware, all self-taught, and eventually helped start the tech office for the Sheriff's Dept. in Baton Rouge where he'd been a cop for years. After he left law enforcement in 2007 he's done networking and programming for municipalities and private companies......and he's completely worthless as far as learning anything from.....dude can't explain anything to save his life, and has no interest. He also doesn't understand what I went back to school for, he heard the word "coding" and thinks I'm a programmer now. I tried to explain to him that my coding ability is mostly limited to data retrieval, manipulation, and ML algorithms, but every time I see him now he shows me something he's working on, which just looks like The Matrix to me, and expects me to understand it. I just smile, nod, tell him good job, and wonder why he doesn't understand I don't know what the hell I'm looking at. And he doesn't even read any cool books. He's the worst kind of tech-nerd. The kind you can't learn from and has no other nerd aspects that make him fun to talk to.
-
I bought my first computer (and the first computer anyone in our family owned) in 1982 using my yard work/lawnmowing money, it was a Timex Sinclair 1000. The 1000 meant it had 1,000 bytes of memory. 1K. And the video memory was shared with system RAM, so if you weren't careful, your program could overwrite video memory, and then you couldn't see what you were doing.
I bought my second computer (and the second computer anyone in our family owned) probably the next year, it was an Atari 400. It had a whopping 16K of RAM and I also got the external cassette tape drive peripheral.
Then our family finally bought an Apple IIc in 1984 and that's what we all used until I went off to college in 1990. It had a built in 5.25" floppy drive and we got an external one as well. And we had a daisy wheel printer, so no crummy looking dot matrix papers for US! Which was good because my teachers wouldn't accept dot matrix printing. If you didn't have a daisy wheel true resolution printer, your work was expected to be typewritten.
In elementary school we had Apple IIes but in middle school and high school we used PC compatibles for our work. In high school I learned Pascal on the PCs but I learned FORTRAN by telnetting into the UT Taurus dual cyber mainframes from CDC. I was lucky and could use our Apple IIc to dial up and gain entry, but less tech-fortunate friends had to use the teletypes at the school to gain access. At least we didn't have to use punch cards!
In college my friend and roommate had a Mac, and we also used Macs for our Pascal programming class (UTEE wouldn't switch to C as its base computer class for another couple of years, so I had to learn that one on my own). But my junior year I used some of my scholarship stipend and bought a killer PC system with a 486 DX2/66. That thing was SCREAMING fast. Worked great for playing Doom.
-
We got our IBM-compatible, I'd say, probably around '88 or '89. But maybe as late as '90. I don't have a clear association in my memory with what grade I was in, so I can't remember for sure. As many times as I saw it boot up I should be able to remember the specs, but I'm just guessing when I say it said 256k memory, but I think that's right. I don't know how much hard drive space. We replaced that around '92 or '93 with a machine that had the slick, new Windows 3.1 pre-installed. That one had 4 MB of RAM and I was hyper-impressed and couldn't imagine what could possibly use that much memory. The hard drive was 200 MB, but my brother-in-law, who is quite a tech-nerd himself, did something he called "stacking" the hard drive, and increased its capacity to 400 MB. To this day I have no idea what that is or what he did. I'm not aware of any procedure to be done on a modern hard drive that can double its storage capacity. I only know he wasn't making it up. Without changing the hard drive, the specs it listed did double.
Wow, old memories... When you mentioned it I vaguely remembered doing something similar, and it wasn't actually doubling the space but it was compressing files to make the HDD appear larger.
Some googling brought me to the original "Stacker": https://en.wikipedia.org/wiki/Stac_Electronics
I think when I did this, it might have been the Microsoft version, DriveSpace or DoubleSpace: https://en.wikipedia.org/wiki/DriveSpace
I can imagine that this would have seemed like black magic to a non-techie :57:
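If anyone's curious how compression can "double" a drive, here's a rough sketch in Python. It has nothing to do with the actual Stacker/DoubleSpace internals (those sat between DOS and the disk and compressed data transparently); it just shows the arithmetic, with a made-up text sample and an assumed typical ratio:
```python
# Toy illustration of why on-the-fly compression made a drive "look" bigger.
# Not how Stacker/DoubleSpace actually worked internally -- this just shows the
# space math, using zlib on a made-up sample and an assumed average ratio.
import zlib

# Highly repetitive text compresses extremely well; a real mix of files did not.
sample = b"Dear diary, today I played Wolfenstein instead of doing homework. " * 2000
compressed = zlib.compress(sample)
print(f"{len(sample):,} bytes -> {len(compressed):,} bytes "
      f"({len(sample) / len(compressed):.0f}:1 on this very repetitive sample)")

# The drive-doubling arithmetic, using the roughly 2:1 average those tools advertised:
typical_ratio = 2.0    # assumed average compression across a realistic mix of files
physical_mb = 200
print(f"A {physical_mb} MB drive looks like ~{physical_mb * typical_ratio:.0f} MB "
      f"if files average {typical_ratio:.0f}:1 compression")
```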
-
In college my friend and roommate had a Mac, and we also used Macs for our Pascal programming class (UTEE wouldn't switch to C as its base computer class for another couple of years, so I had to learn that one on my own). But my junior year I used some of my scholarship stipend and bought a killer PC system with a 486 DX2/66. That thing was SCREAMING fast. Worked great for playing Doom.
Oh yeah, forgot to mention that 2nd PC I mentioned from 92-93 was the shiny new 486. I played a lot of Wolfenstein on it. As I recall, it was still better to boot into games like that from DOS because it was way faster than waiting for Windows 3.1 to load it.
Windows had a lot of fun, small, pre-installed games back then I wish they'd bring back. One I particularly enjoyed was called Fences, I think. I don't remember exactly when they started including chess, but I had fun getting my butt whipped by that for many years. I now realize the Windows chess program has a pretty low Elo rating, and I can hold my own against it.
But around that time I got a Super Nintendo and that constituted most of my gaming from there on out. The world of PCs became mainly a utilitarian thing for me.
-
Then our family finally bought an Apple IIc in 1984 and that's what we all used until I went off to college in 1990.
Also, your ancient ass is older than I thought you were. Damn.....you gonna make it to next year, or what?
-
hah!
-
Also, your ancient ass is older than I thought you were. Damn.....you gonna make it to next year, or what?
Again...
University of Texas Electrical Engineer 1994.
1994 was my graduation year for undergrad. I spent 4 years in college (no 5 or 6 year plan for me, my scholarships only lasted for 4 years and anything beyond that would have been on my own dime, of which I had very few at the time).
As for your question, it's always a crap shoot at this point.
-
Yes, but for some reason it was stuck in my head for many years that you entered in '94.
Which I admit, is less likely to be a handle than a graduation year, but I know a lot of people who adopted email handles for the first time when we were freshmen in college, who used the current year (freshman year) at the end of their handle name, and by the time graduation rolled around, never changed their account names because JohnDoe97 was already what they were used to and was in all their friends' address books.
I think when I met you years ago I just filed it under "Oh, he must've entered UT back in 1994" and never thought more about it. Now I'm old and you have to tell me things several times or else I won't remember them :-D
-
You're only as old as the women you feel.
-Groucho Marx
(maybe)
-
I too graduated in 1994, but it took me 10 years.
-
What was the deal with Pascal as a computer language back in the day? We learned it in HS in one of my computer classes, and it seems like it was popular to teach. By the time I got in college, they were teaching mostly C. I have a vague memory of taking a class that taught C, but for the life of me I can't remember if it was a class about C or if C was part of the class. About 30 years ago to my reckoning (29 to be more exact) so the memory is a little fuzzy.
Anyways, I think Pascal as a language died out.
-
I was forced to learn and use Pascal and Fortran.
-
It's laughable because somewhat recently I was helping somebody in their early 20's do something on the computer, and I was CTRL-X and CTRL-C and CTRL-V and they were looking at me really weird. I asked them what was wrong, and they were bewildered at my keystroke shortcuts. I had to show them how to copy, paste, cut, etc. using the keyboard shortcuts. They had no clue. We learned computers with no mouse years ago, so keyboard shortcuts were the norm.
-
What was the deal with Pascal as a computer language back in the day? We learned it in HS in one of my computer classes, and it seems like it was popular to teach. By the time I got in college, they were teaching mostly C. I have a vague memory of taking a class that taught C, but for the life of me I can't remember if it was a class about C or if C was part of the class. About 30 years ago to my reckoning (29 to be more exact) so the memory is a little fuzzy.
Anyways, I think Pascal as a language died out.
I would expect that it was probably popular to teach because it was a much more "modern" programming language than BASIC. I've always viewed teaching programming as largely being separate from the language selected, because fundamentally you're teaching concepts. BASIC didn't have enough on its bones beyond very simple stuff. But Pascal had enough to teach from.
IMHO schools (especially high schools) probably kept using it at the time over C because it was more mature, there were textbooks/resources to use [and re-use each year], etc. It takes a lot more effort for a HS computer class teacher to learn a new language and then develop a teaching curriculum around it rather than just teaching the same stuff they'd used for the last 5+ years. I suspect that's why I learned Pascal in the mid-90s in HS, even though C had probably largely displaced it commercially by then.
Oddly enough I learned C, assembly, and some scripting languages in college, but never C++ or object-oriented programming. To this day I don't really know what OOP is :57:
-
What was the deal with Pascal as a computer language back in the day? We learned it in HS in one of my computer classes, and it seems like it was popular to teach. By the time I got in college, they were teaching mostly C. I have a vague memory of taking a class that taught C, but for the life of me I can't remember if it was a class about C or if C was part of the class. About 30 years ago to my reckoning (29 to be more exact) so the memory is a little fuzzy.
Anyways, I think Pascal as a language died out.
My senior year of high school I signed up for a programming class, which taught Turbo Pascal. It was a disaster, and I learned nothing. I kind of regret not going to school for programming, because the little bit I can do these days, I really enjoy. I might have liked working on larger projects, who knows. But that experience really put me off of it and kind of killed my confidence.
It was a tele-learning class, and we just had a proctor in the room, as our little high school had no teachers qualified to teach any kind of programming. There was a terminal and the instructor, a professor at Northwestern State, had the ability to share his screen with us and the other 5 schools online with us. We could hear him and had mics to ask questions, but we couldn't see him, and he was Asian and extremely hard to understand, so it was already less than ideal.
The biggest issue was that class met every day, but our high school had switched that year to an alternating schedule they called A and B days. Instead of 1st period through 7th period every day, A Days were longer classes of 1st, 3rd, and 5th periods, B Days were longer classes of 2nd, 4th, and 6th periods, and both days had 7th period for the same amount of time as always. Since that class was not taught during 7th period, we only got every other day's worth of class with the instructor. We were literally missing half of our classes. We were all doing horribly, not understanding anything, and when we complained to the Principal, he sat in on a class and said he didn't see the problem; it seemed like we could hear the instructor and see his screen just fine. He completely missed the point about "Yeah, but what about all the classes he's teaching when we're not here?" In retrospect, he probably understood, but the decision had likely been made at the Parish School Board level and there wasn't anything he could do about it.
I'm not saying that studying overtime and really kicking it into the highest possible gear couldn't have overcome that. But I am saying most of us weren't the type to learn programming on our own at 17 years old, and we all basically got sympathy-D's. We earned F's, undoubtedly. I learned nothing about Turbo Pascal because I was lost by the second week of class.
-
I've talked about this before, but one of my HS classes was using BASIC on the Apple IIgs.
By the first few days of class I realized it was a joke. I.e. the teacher would teach us what a "FOR" loop was, and then we'd have 2 1/2 weeks to "practice" it before he'd move on to the next concept. I knew I was going to be bored out of my %^$!#@& mind in that class.
So I decided that first week to start working on my final project for the class. All it had to be was a program that used every one of the concepts taught in the class, at least once, to do "something". It didn't matter what the program did.
I decided to blow that right out. There was a graphics capability on the IIgs within BASIC, so I ended up programming a version of the popular "Tank Wars / Scorched Earth" game, a turn-based game where you have a bunch of tanks on a 2D terrain and you can adjust the angle and power of your shot to try to destroy the other tanks. I included the obvious ballistics parabolic curve shape, included wind (but no other air resistance), multiple strengths of explosive rounds, etc.
I could have gotten an easy 'A' just doing the bare minimum and spending my daily class time reading a book or doing homework for other classes, but I regret nothing.
The second semester was when we started doing Pascal, and for that I didn't come up with anything interesting to do so I just coasted to the easy 'A'.
-
I would expect that it was probably popular to teach because it was a much more "modern" programming language than BASIC. I've always viewed teaching programming as largely being separate from the language selected, because fundamentally you're teaching concepts. BASIC didn't have enough on its bones beyond very simple stuff. But Pascal had enough to teach from.
IMHO schools (especially high schools) probably kept using it at the time over C because it was more mature, there were textbooks/resources to use [and re-use each year], etc. It takes a lot more effort for a HS computer class teacher to learn a new language and then develop a teaching curriculum around it rather than just teaching the same stuff they'd used for the last 5+ years. I suspect that's why I learned Pascal in the mid-90s in HS, even though C had probably largely displaced it commercially by then.
Oddly enough I learned C, assembly, and some scripting languages in college, but never C++ or object-oriented programming. To this day I don't really know what OOP is :57:
Yeah this all sounds about right to me.
I learned Pascal in high school (and then took the class in college, already knowing it), but I never used it after that. It was widely used for application programming in the 70s and 80s, though.
I used C and C++ quite a bit in my professional career, and had to teach myself. It wasn't difficult, programming languages all use the same general commands and structures. The syntax obviously differs and some are more rigidly structured than others, but at their core they all must do the same types of things once compiled for the CPU, so they can't really differ all that much.
-
programming languages all use the same general commands and structures. The syntax obviously differs and some are more rigidly structured than others, but at their core they all must do the same types of things once compiled for the CPU, so they can't really differ all that much.
Yep.
The way I think about it is that once you learn to code, 90% of what you do transfers nearly seamlessly to learning a new programming language. You learn the basics of syntax and how things are organized, and then you're off to the races.
Of the remaining 10%, half of that is marveling at how something that was annoyingly convoluted and difficult to implement is just an absolute breeze with the structure of the new language and you don't have to bang your head against the wall doing that any more. And the other half is finding that something that was an absolute breeze in the old language is annoyingly convoluted and difficult to implement so you bang your head against the wall any time you have to do it :57:
-
Oddly enough I learned C, assembly, and some scripting languages in college, but never C++ or object-oriented programming. To this day I don't really know what OOP is :57:
Hey, I might have found a place where I have a nerd-leg up on you!
I'm a bit murkier on the more general concept, but functionally, at least for data visualization, I do know something about it. I often use a Python module called Matplotlib to visualize data, and there are basically two ways to code when using it. You can do MATLAB style, which is the interface Matplotlib was originally modeled on, or OOP style. They achieve exactly the same thing; Matplotlib just treats the road to get there a little differently.
Of course, Python is really just running C under the hood (CPython, anyway).
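Since we're already in the weeds, here's a minimal sketch of the two Matplotlib styles, with made-up data. Both produce the same chart; the OOP style just scales better once you have multiple subplots:
```python
# Same chart two ways: pyplot "MATLAB-style" (implicit current figure/axes)
# vs. the object-oriented style (explicit Figure and Axes objects).
import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5]
y = [2, 3, 5, 7, 11]

# MATLAB-style: module-level pyplot functions act on an implicit "current" axes.
plt.figure()
plt.plot(x, y, marker="o")
plt.title("MATLAB-style")
plt.xlabel("x")
plt.ylabel("y")

# OOP style: hold explicit Figure/Axes objects and call methods on them.
fig, ax = plt.subplots()
ax.plot(x, y, marker="o")
ax.set_title("OOP style")
ax.set_xlabel("x")
ax.set_ylabel("y")

plt.show()
```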
-
Yep.
Of the remaining 10%, half of that is marveling at how something that was annoyingly convoluted and difficult to implement is just an absolute breeze with the structure of the new language and you don't have to bang your head against the wall doing that any more. And the other half is finding that something that was an absolute breeze in the old language is annoyingly convoluted and difficult to implement so you bang your head against the wall any time you have to do it :57:
I'm not really a programmer or computer language guru, but that makes sense. I learned Python and R in school, plus SQL, which I know is technically a language, but it's really its own thing imo, and I don't put it in the same category as stuff like Python.
R, in the right environment, is crazy-useful for statistical analysis and I understand why researchers love it.
Python is basically Programming For Dummies Who Don't Understand Programming (at least I think....bear in mind I don't really know any other languages), and I see why it's so popular with data scientists. I've looked at some C++ stuff before and as soon as I realized you have to declare variable types, I b like "Nah, I'm out." Python does that automatically, or rather it infers data types automatically at runtime. That's what makes it so easy, so idgit-proof to learn, and so fast to code in.
It also makes it slower, because under the hood it's having to do a bunch of stuff on the back end that you traded off for ease of coding on the front end. That's why the DS world uses popular data modules like NumPy and pandas, which, among other things, streamline the processing tasks and effectively cheat the system so that you get the ease of the Python language with something close to the speed of compiled C. I mean, they do other useful things too, but that's a lot of it.
At any rate, I see your point, and the programming they taught us to do focused more on learning how to think algorithmically and not really grilling us on syntax. As a result--especially two years removed from school and not really using it much on the job--I often figure out what I want to do, and wind up googling/ChatGPTing some piece of code that I either can't remember or don't know how to do. But I wouldn't even be able to do that if I didn't know how to think through it in the first place. You can't believe how dumb some of ChatGPT's answers are if you just give it a general problem to code for you, even if it works.
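To put a rough number on the NumPy point above, here's a small sketch with made-up data. Exact timings depend on the machine, but the gap is typically one to two orders of magnitude:
```python
# Pure-Python loop vs. NumPy vectorized sum-of-squares over the same data.
# NumPy pushes the loop down into compiled code, which is where the speedup comes from.
import time
import numpy as np

data = np.random.default_rng(0).random(5_000_000)

t0 = time.perf_counter()
total_loop = 0.0
for value in data:                        # interpreted loop: every iteration pays Python overhead
    total_loop += value * value
t1 = time.perf_counter()

total_vec = float(np.sum(data * data))    # one call: the loop runs in compiled code
t2 = time.perf_counter()

print(f"loop:       {t1 - t0:.2f} s  (result {total_loop:.2f})")
print(f"vectorized: {t2 - t1:.3f} s  (result {total_vec:.2f})")
```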
-
Oh yeah I had to use Matlab to process output data from various labs throughout college. Just remembered that.
And most scripting languages are just shortform versions of C or Pascal or some other high-level language. The primary difference is that scripting languages are interpreted at run time while compiled languages are built ahead of time for quicker and more efficient final operation.
-
Yeah, and it's amazing, as we talk about computing power, RAM capacity, storage capacity, just how much all this power...
...allows software developers to be lazy.
Which I completely understand, of course. Shipping a software product that isn't perfectly optimized for performance generates a lot more revenue than not shipping anything. And with widely disparate computing platforms these days, sometimes it's even beneficial to select an inefficient language like Java because you know it's generic / cross-platform and it's interpreted on the fly by the target rather than being compiled specifically for that hardware architecture. Sometimes that inefficiency is a necessary evil.
But it certainly leads to a lot of bloat in the aggregate...
-
Yeah, and it's amazing, as we talk about computing power, RAM capacity, storage capacity, just how much all this power...
...allows software developers to be lazy.
Which I completely understand, of course. Shipping a software product that isn't perfectly optimized for performance generates a lot more revenue than not shipping anything. And with widely disparate computing platforms these days, sometimes it's even beneficial to select an inefficient language like Java because you know it's generic / cross-platform and it's interpreted on the fly by the target rather than being compiled specifically for that hardware architecture. Sometimes that inefficiency is a necessary evil.
But it certainly leads to a lot of bloat in the aggregate...
It used to drive me crazy, having been trained to create the most efficient code possible.
But these days, the compute power and hardware overhead is SO large by comparison, it really doesn't matter all that much. Still, as a matter of principle...
-
Yeah, and it's amazing, as we talk about computing power, RAM capacity, storage capacity, just how much all this power...
...allows software developers to be lazy.
Which I completely understand, of course. Shipping a software product that isn't perfectly optimized for performance generates a lot more revenue than not shipping anything. And with widely disparate computing platforms these days, sometimes it's even beneficial to select an inefficient language like Java because you know it's generic / cross-platform and it's interpreted on the fly by the target rather than being compiled specifically for that hardware architecture. Sometimes that inefficiency is a necessary evil.
But it certainly leads to a lot of bloat in the aggregate...
Ok, you out-nerded me again. That didn't take long, my victory was short-lived.
-
You can't spell geek without EE.
-
It used to drive me crazy, having been trained to create the most efficient code possible.
But these days, the compute power and hardware overhead is SO large by comparison, it really doesn't matter all that much. Still, as a matter of principle...
Pretend for a moment that I actually have a job doing something I went to school for.
It still behooves me, in the little arena I know something about, to be efficient. I'm not saying I'm great at that. Maybe far from it. I'd doubtlessly benefit from working with people who have been doing it for years and can offer tips on efficiency. But if I'm wrangling or visualizing data with 20 million rows and 200 columns or somesuch, it helps to know how the functions operate under the hood, because some methods can save serious time and hardware-usage-hours, especially on an average computer where a lot of that stuff is still done.
And I guess it's not really the same thing, but for building ML models, knowing the math underneath is helpful, because some things are going to bog down the process horribly, in some cases to the point of crashing, so it helps to understand what types of solutions should be tried for what types of problems. You can fit a really good model that's unfortunately and needlessly inefficient.
It seems to me there's still a necessity for efficiency in the analytics/ML world. But now go back to the part where I don't do that in the real world and remember that maybe I don't know what I'm talking about.
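As one concrete example of the under-the-hood stuff I mean when wrangling tens of millions of rows: just telling pandas the right dtypes can cut memory use by a large factor. Column names and data here are made up, and the exact savings depend on the real data:
```python
# Shrinking a wide DataFrame by choosing better dtypes -- a cheap, boring optimization
# that matters a lot at tens of millions of rows. Columns and values are made up.
import numpy as np
import pandas as pd

n = 1_000_000
df = pd.DataFrame({
    "student_id": np.arange(n, dtype=np.int64),
    "campus": np.random.choice(["Main", "North", "Online"], size=n),  # low-cardinality text
    "credit_hours": np.random.randint(0, 21, size=n),
    "gpa": np.random.uniform(0.0, 4.0, size=n),
})

before = df.memory_usage(deep=True).sum() / 1e6

df["campus"] = df["campus"].astype("category")          # repeated strings -> small integer codes
df["credit_hours"] = df["credit_hours"].astype("int8")  # values fit comfortably in 8 bits
df["gpa"] = df["gpa"].astype("float32")                 # half the bytes, if you can spare precision

after = df.memory_usage(deep=True).sum() / 1e6
print(f"{before:.0f} MB -> {after:.0f} MB")
```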
-
Pretend for a moment that I actually have a job doing something I went to school for.
It still behooves me, in the little arena I know something about, to be efficient. I'm not saying I'm great at that. Maybe far from it. I'd doubtlessly benefit from working with people who have been doing it for years and can offer tips on efficiency. But if I'm wrangling or visualizing data with 20 million rows and 200 columns or somesuch, it helps to know how the functions operate under the hood, because some methods can save serious time and hardware-usage-hours, especially on an average computer where a lot of that stuff is still done.
And I guess it's not really the same thing, but for building ML models, knowing the math underneath is helpful, because some things are going to bog down the process horribly, in some cases to the point of crashing, so it helps to understand what types of solutions should be tried for what types of problems. You can fit a really good model that's unfortunately and needlessly inefficient.
It seems to me there's still a necessity for efficiency in the analytics/ML world. But now go back to the part where I don't do that in the real world and remember that maybe I don't know what I'm talking about.
If accuracy and efficiency were the sole goals when bringing a software product to market, then there'd be no issue.
But as bwar alluded to earlier, there's a time-to-market component that can't be ignored. Faster to market means more money over the lifecycle of the product. And also the more time spent on it, the more expensive the final product.
So like anything else, there's a tradeoff between quality, and timeliness. More efficient code, more accurate code, better tested code-- these are all desirable things. But if the tradeoff in time to market and/or production cost is too high, then it's not worth the effort.
And now that compute and storage and most other hardware factors are so large and powerful, the need to create small, efficient, high quality code, is diminished.
-
If accuracy and efficiency were the sole goals when bringing a software product to market, then there'd be no issue.
But as bwar alluded to earlier, there's a time-to-market component that can't be ignored. Faster to market means more money over the lifecycle of the product. And also the more time spent on it, the more expensive the final product.
So like anything else, there's a tradeoff between quality, and timeliness. More efficient code, more accurate code, better tested code-- these are all desirable things. But if the tradeoff in time to market and/or production cost is too high, then it's not worth the effort.
And now that compute and storage and most other hardware factors are so large and powerful, the need to create small, efficient, high quality code, is diminished.
Yep.
And note that this can somewhat be a "bringing a software product to market" statement, that does NOT generalize to all software.
For ML, I think efficiency is extremely important, especially as the data size scales. Mike may know more about this than I do, but if you're testing iterative algorithm changes, and you can bring the time to analyze your data set from 48 hours to 24, you can test twice as much in the same period of time, and learn more than you would with fewer iterations.
Another one that is highly important is in cloud computing. Often when people think of "software", they think of a "program" running on a "computer". But the massive increase in computing power has meant that this isn't really the case any more.
We have virtualization where you might have a very high number of "computers" running on one "computer". What that means is that a single server will be operating multiple "virtual machines" where it's basically creating a software-virtualized "computer" and operating system that you can run an application that--for all it knows--thinks it's being run on a single PC. Once you start doing this, efficiency becomes very important again. Especially if you're paying for the compute resources from a cloud compute provider.
This is then extended by containerization. This is where you take certain functions that perhaps need to be separated from each other, but at the same time you may need hundreds or thousands of them going on at any given time.
Think of something like Ticketmaster when they're selling concert tickets. You may have 5,000 individual users logged in searching for tickets, and each search will be a unique experience to that user that includes the amount of time tickets they select are held for payment, their process of going through the order / credit card / etc aspect. "Containers" are used to basically replicate that process many hundreds or thousands of times at once, while also making each one independent of all the others b/c you don't want a bug or issue where suddenly you and I are both buying tickets at the same time and a glitch means I get your front-row tickets but my CC is charged my nosebleed price, and you get my nosebleed seats but your CC gets charged your front-row price.
If you're doing one of something, efficiency doesn't matter. If you're doing hundreds or thousands of that same thing at once across your hardware... Efficiency is critical.
So it's not meant to be a blanket statement. It's more a statement that if the [in]efficiency of your code is someone else's problem (i.e. it's on someone else's computer), it's not anywhere near as important to you as a developer as if you're going to be the one paying for the computing power to run it at scale, whether that's on-premises or via a cloud computing service.
-
For ML, I think efficiency is extremely important, especially as the data size scales. Mike may know more about this than I do, but if you're testing iterative algorithm changes, and you can bring the time to analyze your data set from 48 hours to 24, you can test twice as much in the same period of time, and learn more than you would with fewer iterations.
Or from 3 weeks to 3 hours :57:
-
So like anything else, there's a tradeoff between quality, and timeliness. More efficient code, more accurate code, better tested code-- these are all desirable things. But if the tradeoff in time to market and/or production cost is too high, then it's not worth the effort.
please just make software that works
I don't give a damn about efficiency, let the little circle spin another couple seconds, for shit's sake
just make it work! Please!!!
-
(https://i.imgur.com/Mldnvm6.jpeg)
-
Back in the 80's I had some off-brand system. I don't remember what it was called. It had about 4 games native to the console, and that was it. No cartridges, no buying new games. It had some version of pong, a "tennis" game, and a couple other things I don't remember. Basically lines on the screen you could move with controllers. My cousins had an Atari and it was like alien-level AI and Pixar-film-worthy graphics compared to whatever that was I had.
Still had fun on it though :-D
Some years later Nintendo came out and I busied my time with an Italian plumber who kept losing his gf to a fire-breathing dragon.
-
(https://i.imgur.com/Mldnvm6.jpeg)
My guess is about 1983/84 from the price and types of consoles. The Atari 5200 was a total disaster; they jumped the shark with that one. The controllers were horrid above all, and I don't even think I know anybody that owned one. Coleco had some great games; its Donkey Kong port was the best. Intellivision was a good console, and a few friends had them.
The Atari 2600 was pretty much the first home console (I don't count the original Fairchild Channel F or Magnavox Odyssey; very few of those were sold).
-
(https://i.imgur.com/5XYhnzh.jpeg)
-
I can't even imagine attempting to troubleshoot line problems in that rat's nest...
-
Some years later Nintendo came out and I busied my time with an Italian plumber who kept losing his gf to a fire-breathing dragon.
A couple of years ago, Nintendo released their NES Classic Edition (https://www.nintendo.com/en-gb/Misc-/Nintendo-Classic-Mini-Nintendo-Entertainment-System/Nintendo-Classic-Mini-Nintendo-Entertainment-System-1124287.html).
It was a very limited run, and although cheap, VERY hard to find/order. We managed to get one, though.
Great fun these days for us...
-
More than a couple of years ago, my brother really wanted whatever the latest Xbox or PS console was, but his wife was really against it. I guess she thought he'd waste too much time, even though he was by far the primary breadwinner and was always busy trying to keep the homestead and cars maintained and presentable. They subsequently got divorced and about his first purchase after the split, was the latest Xbox. And he still managed to hold down his job and maintain the household and car.
Anyway, back then, I bought him one of these for Christmas, and we actually played it a decent amount when hanging out. Our kids loved it, too.
(https://i.imgur.com/ULX7i7I.jpeg)
-
A couple of years ago, Nintendo released their NES Classic Edition (https://www.nintendo.com/en-gb/Misc-/Nintendo-Classic-Mini-Nintendo-Entertainment-System/Nintendo-Classic-Mini-Nintendo-Entertainment-System-1124287.html).
It was a very limited run, and although cheap, VERY hard to find/order. We managed to get one, though.
Great fun these days for us...
People online (who are always right) indicate that Nintendo is extremely litigious and will go hard after anyone or anything to do with emulators, which pisses off the old-school NES fans because Nintendo also refuses to reissue their old platforms as demand dictates.
I think my NES "most frequent" award had to be Mike Tyson's Punchout and Contra. Never could beat Mike himself. Contra, on the other hand, I was quite good at and could get through sometimes even without the famous Konami cheat code for extra lives. I really liked the Super Mario trilogy for the old NES as well. Even though SM2 was kinda weird and the princess had such insane athletic ability that I lost sympathy for her getting captured in the other games.
-
I was also fascinated by the weird "minus world" in the original Super Mario. To this day, I'd like an explanation for that.
An old elementary school friend I used to borrow games from and was a major game-nerd now works for Nintendo America as a game designer/software developer. He doesn't know what it was about either.
-
....the famous Konami cheat code....
up up down down left right left right B A Start
since this is the nerd thread :)
-
And the funniest thing about Nintendo right now is that the company president's name is...
Doug Bowser
-
People online (who are always right) indicate that Nintendo is extremely litigious and will go hard after anyone or anything to do with emulators, which pisses off the old-school NES fans because Nintendo also refuses to reissue their old platforms as demand dictates.
I think my NES "most frequent" award had to be Mike Tyson's Punchout and Contra. Never could beat Mike himself. Contra, on the other hand, I was quite good at and could get through sometimes even without the famous Konami cheat code for extra lives. I really liked the Super Mario trilogy for the old NES as well. Even though SM2 was kinda weird and the princess had such insane athletic ability that I lost sympathy for her getting captured in the other games.
(https://i.imgur.com/37mr2XW.png)
-
This thread brings back a lot of nostalgia for me. I came of age right during the video game revolution, had an Atari 2600 in elementary school, NES by about 6th/7th grade. Commodore 64 in elementary school. I am especially fond of the NES, which really changed how home video games were perceived. Before NES, the games were just basically all about high scores and such. There wasn't really much to do, other than shoot/drive/high score. Take one of the best A2600 games, Pitfall!: you just ran one way or the other, got treasure, and kept going. It had a 20 minute timer. No real music, just a few SFX. I can still remember the very first time I played the OG Super Mario on NES. It had all the little hidden things, so many places to explore, so many levels. We literally played it for days, weeks, and months. Zelda was similar.
A lot of people thought Atari messed up because their consoles sucked, but when I looked back I realize that it was really the games that sucked. They never got past the "Arcade" model.
We also had a Sega Genesis, which was excellent as well.
-
We had the Atari 2600, and my Atari 400 computer could also be used as a gaming console, it had ROM cartridges as well. But by the time the NES was getting popular I was moving past my gaming console phase and into girls and cars. A college roommate had a Sega Genesis that I only ever played Mortal Kombat on, but even that was pretty sparing. For a brief time in 1993 or 94 I was playing some Doom on my x86 PC system but that was also pretty short-lived.
In short, I'm not much of a gamer, and never have been.
Except for standup Galaga at the arcade. I was a wizard at that.
-
Except for standup Galaga at the arcade. I was a wizard at that.
Local Pizza Hut had Galaga and Pole Position. I spent a lot of time begging my mom for quarters, not nearly enough were forthcoming for my liking.
-
Local Pizza Hut had Galaga and Pole Position. I spent a lot of time begging my mom for quarters, not nearly enough were forthcoming for my liking.
Difference in eras, I'm guessing.
Now when we go to Pizza Port Brewing Co with the kids, I'm happy to give them however many quarters they want so my wife and I can drink our beer in peace :57:
-
When our kids were little, my friend Bald Greg and I would take them to Pinthouse Pizza (brewpub in Austin) and send them over to the little video game section. They had a Ms. Pacman, a Joust, and a Rampage standup video game. We'd tell them the game had already started and they'd think they were playing it, while the demo screen was running. Saved us a bunch of quarters!
-
I quite liked the original Teenage Mutant Ninja Turtles game for NES. It was the only one that ever really made a meaningful distinction between the turtles and their weapons of choice. You could sub out turtles to get through different situations and accomplish different tasks because of the differences. The only "drawback" was the nature of the levels mostly rendered Michaelangelo and Raphael worthless, because their weapons were so short-range, though faster. Usually, you needed Donatello's bo for longer range stuff (but it was slow, that was the tradeoff), or Leonardo's swords which were a good all-around weapon, reasonably fast and with a bit of distance. The levels were interesting, you could go different places, backwards, through doors and into rooms, etc. Sometimes more cerebral than games I was used to, and the graphics were interesting.
The second one they put out seemed to have been more popular, and it was fun in a more mindless, full-on shoot-em-up kind of way, and it introduced the jump kick, which wasn't present in the first game. But all the turtles were basically the same, and the levels were nothing but side-scrolling walk-throughs.
-
Also, Legend of Zelda II
I loved that game. Somehow I never played the first Zelda release.
-
If you remember that era and know anything about the games, there's a YouTube channel called Dorkly that has a lot of funny stuff based on them.
-
Also, Legend of Zelda II
I loved that game. Somehow I never played the first Zelda release.
You just literally listed two of the hardest games ever designed for the NES.
-
Come to think of it, I don't think I ever beat Zelda II.
Now that I think about it, without a friend who had some magazines that gave a lot of tips and mapped some things out, I don't think I would've even gotten close to the end. I remember there came a point in the game where it wasn't obvious any longer what to do or where to go. There was some pretty tricky outside-the-box thinking that had to be done. Well....if you're 8, anyway. Maybe if I'd been a little older I would've figured it out on my own.
-
Oh yeah I played this on our Apple IIc
(https://i.imgur.com/yVRndWs.png)
-
Come to think of it, I don't think I ever beat Zelda II.
Now that I think about it, without a friend who had some magazines that gave a lot of tips and mapped some things out, I don't think I would've even gotten close to the end. I remember there came a point in the game where it wasn't obvious any longer what to do or where to go. There was some pretty tricky outside-the-box thinking that had to be done. Well....if you're 8, anyway. Maybe if I'd been a little older I would've figured it out on my own.
I never beat Zelda 2, and I owned the game. It was freaking hard.
-
(https://i.imgur.com/i7FhF9v.jpeg)
-
(https://i.imgur.com/n6phLTL.jpeg)
-
Oh yeah I played this on our Apple IIc
(https://i.imgur.com/yVRndWs.png)
Whut dis?
This rings a bell, but only as something I've seen somewhere and can't put my finger on.
-
Whut dis?
This rings a bell, but only as something I've seen somewhere and can't put my finger on.
Ultima II on an Apple IIe (or IIc in my case).
-
If I find myself in an arcade today, I look for Ms. Pac-Man.
I grew up on the NES, SNES, then PS2, XBox....idk
I won a tiny trophy for being the best in my daycare in Super Mario Bros. I never owned SMB2, but my neighbor did.
I was a big gamer my whole life, but only when it was too dark to play outside. I'd never pick video games over playing outside/sports.
Since starting Whoa Nellie and having no free time on my hands, I plateaued at an extra Xbox One my brother gave me. The last games I played a lot were Tropico 5, Red Dead 2, and GTA 5.
My favorite games growing up were Super Mario Bros, the NCAA games, and Ken Griffey Jr baseball on the SNES. I edited all the rosters back when that was a challenge to do. The Dodgers were good because they had Piazza's rookie season (1993).
I'd play Rampage and try to do all the levels in Bubble Bobble when I had friends for a sleepover. Those were the days.
Then in high school, we'd endlessly play Goldeneye (4-player) into the wee hours.
Playing my little brother in video games was just an opportunity for him to go nuts, because of the 4-year age difference. We'd play Gauntlet and I'd shoot the food so he couldn't get it, and he'd freak out. I wasn't a very nice brother in that respect. I can still hear the phrase now: "red warrior shot the food!"
-
I definitely beat Zelda II (link). It really was super hard. Peer group playing and watching others helped a ton. I remember drawing out schematics of worlds and paths... then some nerd showed up with some book or Nintendo Power-type magazine. We wanted to murder him.
Our drawings were pretty good.
-
We used to draw side-scrolling levels for fun. It was a blast.
-
I definitely beat Zelda II (link). It really was super hard. Peer group playing and watching others helped a ton. I remember drawing out schematics of worlds and paths.. then some nerd showed up w some book or Nintendo power type magazine. We wanted to murder him.
Our drawings were pretty good.
Showoff.
-
A friend of mine had a game called Jill of the Jungle for PC which was fun, from what I remember of playing it at his house. I wanted to get it, but it required a separate math co-processor chip or it wouldn't run, and my PC didn't have one.
Not long after that, all the stuff such a chip did in that era was folded into any regular processor, and those chips ceased to be a thing.
I didn't even know what that meant, and I still don't, really. I just pictured a chip in my friend's computer, doing algebra homework for some reason, while the game ran.
-
I didn't even know what that meant, and I still don't, really.
They were used for floating-point arithmetic (https://en.wikipedia.org/wiki/Floating-point_unit). As you can imagine, computers only deal in 1s and 0s. The processors of the time didn't have dedicated silicon for advanced arithmetic. Which means they had to essentially use software algorithms to do any complex mathematical operation. And software is much slower than dedicated hardware inside the chip for these operations.
So if you didn't have a math coprocessor (FPU), your computer could still do all those calculations, entirely in software. Which wasn't fast enough for those games. If you had the FPU, then it would simply route all those instructions to the FPU, and you had plenty of performance for games.
Now, as you mention, any modern processor will have that function built in.
For modern computers, the analogy would be the graphics cards (or GPUs) needed for games. Essentially the same thing--complex graphics rendering takes an extraordinary amount of computing power to emulate in software. But if you can have dedicated silicon that do the necessary functions in hardware, you can not only get it done quickly but leave the main processor (CPU) to use its resources on other things.
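If anyone wants to see why "doing it in software" hurts, here's a toy sketch: multiplying two doubles using nothing but integer operations on their bit patterns, which is roughly the kind of work a chip without an FPU had to grind through for every multiply. Heavily simplified, and obviously real software-float routines were hand-tuned assembly, not Python:
```python
# Toy "software floating point": multiply two doubles using only integer operations
# on their IEEE-754 bit patterns -- a crude stand-in for what a CPU without an FPU
# had to do. Simplified: no zeros, infinities, NaNs, subnormals, or correct rounding.
import struct

def bits(x: float) -> int:
    return struct.unpack("<Q", struct.pack("<d", x))[0]

def soft_mul(a: float, b: float) -> float:
    ai, bi = bits(a), bits(b)
    sign = (ai >> 63) ^ (bi >> 63)
    exp = ((ai >> 52) & 0x7FF) + ((bi >> 52) & 0x7FF) - 1023   # add exponents, remove one bias
    man_a = (ai & ((1 << 52) - 1)) | (1 << 52)                 # restore the implicit leading 1
    man_b = (bi & ((1 << 52) - 1)) | (1 << 52)
    product = man_a * man_b                                    # 105- or 106-bit integer product
    if product >> 105:                                         # renormalize if it carried
        product >>= 53
        exp += 1
    else:
        product >>= 52
    out = (sign << 63) | (exp << 52) | (product & ((1 << 52) - 1))
    return struct.unpack("<d", struct.pack("<Q", out))[0]

print(soft_mul(3.5, -2.0))   # -7.0, after a couple dozen integer operations
print(3.5 * -2.0)            # -7.0, one FPU instruction
```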
---------------
But... Fun story. In 2001, in my first job out of school, I worked for a company that produced programmable logic (FPGA) chips. These were "general" chips full of logic that you could use to map out complex logic and still have it run "in hardware", which was important for MANY functions if you needed the speed of hardware but whatever you were doing didn't lend itself to actually having the dedicated chips designed and fabricated to do it.
Well, one of the things it had at the time was a software-designed (known as a soft core) embedded processor function. Meaning you could emulate a processor in the logic, and use it to run software as opposed to dedicated complex logic. As it was new, the company had an internal design competition to show ways to use the processor. The group I was in... Designed an FPU to go along with it as it didn't have one natively in the design. And we tested software processing of floating-point arithmetic vs our "coprocessor", and our FPU showed a 100-fold reduction in number of clock cycles to perform calculations compared to emulating it in software.
-
I have no idea how you'd do it, but it would be neat to do a back of the napkin calculation on how much computing power your iPhone or Android has (in your pocket you carry everyday) versus the entire computing power of the world of a certain date.
For example, without knowing all the specifics of the era, I can confidently say that my (aging) iPhone 13 has more computing power than existed in the entire world in 1950. And probably also 1955. 1960-65, I would guess that I would still exceed the entire computing power of the world, but I'm not sure. 1970-75, I'd doubt it. 1980? Probably not.
I guess you'd have to estimate how much power a typical computer from back then had and then estimate how many systems they shipped etc. I read somewhere awhile back that we make more data in one day than existed in the entire history of the world until about 2003. This was a few years ago, so we might make more data in 1 hr than the rest of the world until 200x.
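For what it's worth, the napkin math itself is easy; it's the inputs that are pure guesswork. Something like this, where every number is a placeholder you'd want to replace with actual research:
```python
# Back-of-the-napkin: one modern phone vs. "all the computers in the world" in 1960.
# Every figure below is a rough placeholder guess, not a researched number.
PHONE_FLOPS = 1e12            # assume a modern phone SoC is on the order of 1 TFLOPS (guess)
AVG_1960_MACHINE_FLOPS = 1e5  # assume a typical 1960 mainframe did ~100 kFLOPS (guess)
MACHINES_IN_1960 = 10_000     # assume ~10,000 installed computers worldwide in 1960 (guess)

world_1960 = AVG_1960_MACHINE_FLOPS * MACHINES_IN_1960
print(f"Estimated 1960 world total: {world_1960:.1e} FLOPS")
print(f"One phone today:            {PHONE_FLOPS:.1e} FLOPS")
print(f"Phone vs. 1960 world:       {PHONE_FLOPS / world_1960:,.0f}x")
# With these placeholders the phone wins by about six orders of magnitude; the interesting
# question is which decade the crossover happens in, and that hinges entirely on the guesses.
```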
-
I have no idea how you'd do it, but it would be neat to do a back of the napkin calculation on how much computing power your iPhone or Android has (in your pocket you carry everyday) versus the entire computing power of the world of a certain date.
For example, without knowing all the specifics of the era, I can confidently say that my (aging) iPhone 13 has more computing power than existed in the entire world in 1950. And probably also 1955. 1960-65, I would guess that I would still exceed the entire computing power of the world, but I'm not sure. 1970-75, I'd doubt it. 1980? Probably not.
I guess you'd have to estimate how much power a typical computer from back then had and then estimate how many systems they shipped etc. I read somewhere awhile back that we make more data in one day than existed in the entire history of the world until about 2003. This was a few years ago, so we might make more data in 1 hr than the rest of the world until 200x.
This is why bwar's employer exists and persists. :)
-
My wife has an iPhone 15 and says she took 3500 photos on our trip (in addition to the thousands she already had). Just the storage for that many high res photos would be impressive.
-
I find that when I take photos that I'm not in, they end up super uninteresting, because you can pretty much pull the same photos from any quick search. I still take 'em, but they never look as good as the real thing did in person.
-
This is why bwar's employer exists and persists. :)
His company is just a front, so that when the aliens arrive, they'll have all our data, strengths and weaknesses, easily available for a much quicker and cleaner conquest and subjugation.
-
I read somewhere awhile back that we make more data in one day than existed in the entire history of the world until about 2003. This was a few years ago, so we might make more data in 1 hr than the rest of the world until 200x.
This is why bwar's employer exists and persists. :)
Yep. I think the factoid we recently used was that more data was created in the last 3 years than in the previous 3,000.
And that the rate of annual data creation will almost triple between 2023 and 2029.
Of course, a tremendous amount of that data is transitory and not stored long term. But the global installed data storage capacity is projected to double over that time frame as well.
The advances in AI/ML increase the ability to extract value from stored data as well, so it should be accretive to the existing projections.
-
Question for some of you more tech-savvy types. What is the deal lately where they will post clips from some TV show or movie and have the screen inverted or wavy lines or something running through the picture constantly? Is this some sort of AI trickery work-around to keep the copyright violations at a minimum, or what exactly is going on?
-
A guy in my department does our federal/state reporting. He does a lot of his data analysis and creates a lot of his reports with Tableau. In talking about something else in passing, I realized he's been loading disparate files run off our database (which is its own Frankenstacked nightmare) and doing all his joins and unions there. It's because he manages the database reports with Excel, which can't even come close to aggregating that amount of data before loading it in Tableau. Unfortunately, while Tableau can do all that stuff for you, under the hood everything is still separate, and your processor feels it.
I asked him a few questions about what he was doing and then asked if he'd share the files so I could try something. With the quickest of small data-cleaning measures, I put it all together in a few minutes for him to try loading it that way. I have never seen the guy so enthused about anything. He came over to my desk with a lot of excitement to tell me how quickly everything loaded and rendered, and how he was used to performing calculations and it taking several minutes a pop, and how amazing this is, and if he can give me all his data at the end of every semester to compile.
Thing is, this was the dumbest, most basic, Day-1 kind of data analytics "prowess" imaginable. Literally, I concatenated some data sets......yippee 🙄.....and he was acting like it was magic. Public university! We'll teach your kids how to do stuff, but we're not very good at doing stuff. Obviously the students who take classes similar to what I went to school for don't get jobs here.
It reminded me of something utee94 told me years ago when he went to work for the BBUN, about how their systems were all clunky, ancient, and inefficient. There are so many examples of that around here. This entire floor is IT, but yet we keep adding to the disfigured monster instead of really updating and modernizing things. I probably just don't understand what all would be involved with truly renovating our processes. However, I at least know what it would take to handle our department's data better, and frankly, it's not much. It just doesn't happen, for whatever reasons.
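For the record, the "magic" a couple of paragraphs up was roughly this, with made-up stand-ins for the real files and columns:
```python
# Stitching separate per-term report extracts into one flat table, so Tableau reads a
# single source instead of joining on the fly. Data and column names are made up.
import pandas as pd

# Stand-ins for the separate files he'd been loading into Tableau one at a time.
fall = pd.DataFrame({"student_id": [1, 2, 3], "program_code": ["ENG", "BIO", "ENG"], "term": "Fall"})
spring = pd.DataFrame({"student_id": [2, 3, 4], "program_code": ["BIO", "ENG", "CHM"], "term": "Spring"})
programs = pd.DataFrame({"program_code": ["ENG", "BIO", "CHM"],
                         "program_name": ["English", "Biology", "Chemistry"]})

# Same columns -> stack the extracts end to end, then join the lookup table once here.
enrollment = pd.concat([fall, spring], ignore_index=True)
combined = enrollment.merge(programs, on="program_code", how="left")

combined.to_csv("combined_for_tableau.csv", index=False)  # one extract for Tableau to read
print(combined)
```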
-
I manage something like that. We enter all our project data into a giant Excel sheet, and then we run a macro that pulls costs out of SAP and creates a new Excel file, which then gets imported back into Excel. Somehow I got put in charge of this thing and the littlest change breaks it. It's broken right now, and I have no idea how to fix it. It just tells me something about a pivot table.
-
The ERP department is in our office. Or, technically, we're in their office. I think maybe they use SAP to integrate with Oracle or SQL Server or whatever they're using. They can't use Excel, though. My department luuuuurrves Excel.
-
(https://i.imgur.com/PbAXWSV.jpeg)
-
Under where it says "Regulators" I want it to say "Mount Up!"
-
I always had a problem with the resistor color coding.
Not that I didn't understand it, remember it, or know how to use it.
But being color blind, I just didn't know what damn colors I was looking at!
Particularly the brown/red distinction, as red--I suffer from protanomaly--is a very dark color to me. Essentially it attenuates the red content of everything I look at. I couldn't easily distinguish it from brown, and sometimes not well from black unless it was a STARK black line. That also gets me in trouble between blue and violet, as my eyes don't properly pick up the red content in violet, so it usually just looks like another shade of blue to me.
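Obligatory nerd-thread cheat sheet: the standard 4-band code in code form (two significant digits, a power-of-ten multiplier, and a tolerance band). It won't help anybody actually see the bands, unfortunately:
```python
# Standard 4-band resistor color code: two significant digits, a power-of-ten
# multiplier, and a tolerance band.
DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
          "green": 5, "blue": 6, "violet": 7, "gray": 8, "white": 9}
MULTIPLIERS = {**{color: 10 ** d for color, d in DIGITS.items()},
               "gold": 0.1, "silver": 0.01}
TOLERANCES = {"brown": "1%", "red": "2%", "gold": "5%", "silver": "10%"}

def decode(band1: str, band2: str, band3: str, band4: str = "gold") -> str:
    ohms = (DIGITS[band1] * 10 + DIGITS[band2]) * MULTIPLIERS[band3]
    return f"{ohms:g} ohms, {TOLERANCES[band4]}"

print(decode("brown", "black", "red"))   # 1000 ohms (1k), 5%
print(decode("red", "black", "red"))     # 2000 ohms -- so misreading brown as red is a 2x error
```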
-
me too
splicing phone cable was a toss up, needed very good light
-
I'm a lucky color non-blind kind of guy. I used to have the resistor codes memorized but that was a long, long time ago.
-
But being color blind, I just didn't know what damn colors I was looking at!
Particularly the brown/red distinction, as red--I suffer from protanomaly--is a very dark color to me. Essentially it attenuates the red content of everything I look at. I couldn't easily distinguish it from brown, and sometimes not well from black unless it was a STARK black line. That also gets me in trouble between blue and violet, as my eyes don't properly pick up the red content in violet, so it usually just looks like another shade of blue to me.
so, you don't use orange golf balls
-
A doctor told me one time I'm not color blind, just color deficient. I can see most of the colors, I just have a hard time distinguishing between certain shades.
-
so, you don't use orange golf balls
Hell no.
Any time I'm paired with someone using a red ball, I point out I'm "red-green" colorblind. As in, I could lose a red ball... on the green.
A playing partner occasionally uses red. I've almost stepped on it in the middle of the fairway, multiple times, because I can't see it against the turf.
Orange is slightly better, but I don't use it. I prefer yellow.
-
I've stepped on red & orange balls and kicked them - just can't see em
bright white seems best for me but still tough to see. I have poor eyesight
can usually see them in the air, but not down.
-
A few times a day, my computer screen goes black, except the bar at the bottom, for about ten seconds. I have no idea why, it recovers and things are fine. It's a Dell of some sort.
-
A few times a day, my computer screen goes black, except the bar at the bottom, for about ten seconds. I have no idea why, it recovers and things are fine. It's a Dell of some sort.
If you can still see the task bar, then both the screen and the video output are fine... Sounds like a software thing.
-
If you can still see the task bar, then both the screen and the video output are fine... Sounds like a software thing.
Yep.
-
Dells suck
-
Any time I'm paired with someone using a red ball, I point out I'm "red-green" colorblind. As in, I could lose a red ball... on the green.
Christmas must be very confusing for you.
-
Traffic lights that are sideways might be an issue.
-
Dells suck
You suck.
-
I'm not really sure what the alternatives are these days, without looking it up.
15-20 years ago I could name a handful of major laptop brands, but they've either all gone out of business, folded into something else, or I just don't see them anymore. Literally everything I see is Dell or Macs. Is HP still making laptops?
-
I'm not really sure what the alternatives are these days, without looking it up.
15-20 years ago I could name a handful of major laptop brands, but they've either all gone out of business, folded into something else, or I just don't see them anymore. Literally everything I see is Dell or Macs. Is HP still making laptops?
No Gateway. No IBM (Lenovo has the ThinkPad brand). Acer is pretty good; my son has one that lasted him all through college. Asus was pretty good, but I haven't used them in a while. Dell seems to be the gold standard these days, however.
-
Not counting Apple, Dell's chief competitors in the US are HP and Lenovo. Other brands are Samsung, Acer, Asus.
They're all made on the same assembly lines in the same countries with pretty much the same internal components. If you value reliability and durability then you shouldn't buy the entry-level products from ANY of them. You get what you pay for.
Oh, except with Lenovo. Then you also get a lot of sneaky Chinese bullshit malware too. They provide that free of charge.
-
The Microsoft Surface is a good laptop until the screen goes to shit (in about 3 years).
We went back to Dell.
-
Christmas must be very confusing for you.
My wife's given up when she's decorating the tree.
Her: Do you see any bare spots?
Me: Yeah, right there. <points>
Her: There's a red ball there...
Traffic lights that are sideways might be an issue.
No issue... Red traffic lights still look "red" to me, they're just not as bright as others would see them. And the green used in traffic lights is distinct enough to me that there's zero chance of confusion there. I think the fact that both are obviously lit helps significantly, whereas picking out a muted red against a dark green background is where I fail.
-
Not counting Apple, Dell's chief competitors in the US are HP and Lenovo. Other brands are Samsung, Acer, Asus.
They're all made on the same assembly lines in the same countries with pretty much the same internal components. If you value reliability and durability then you shouldn't buy the entry-level products from ANY of them. You get what you pay for.
Oh, except with Lenovo. Then you also get a lot of sneaky Chinese bullshit malware too. They provide that free of charge.
I just got my laptop refreshed at work. We did just switch brands. But the previous laptop lasted 5 years. It was definitely showing its age, but it was running.
While working with the IT guy on the refresh I overheard another employee asking the IT guy they were working with, and he said the exact same thing as utee. If you want a laptop, spend up for the good one, because it really WILL make a difference long term.
He also talked about a cool laptop company I hadn't heard of called Framework (https://frame.work/). Apparently they're modular / heavily customizable laptops, built for upgradeability with the ability for a tech geek to actually get into the laptop and install everything. I thought it was a cool concept.
-
I remember Alienware. They were high-end boutique laptops reputed to be very powerful and cutting edge, but I only ever knew one guy who had one, and I didn't trust his opinion as far as how true it was. He was a guy with money who wanted to be different, not a guy who really knew computers.
-
I remember Alienware. They were high-end boutique laptops reputed to be very powerful and cutting edge, but I only ever knew one guy who had one, and I didn't trust his opinion as far as how true it was. He was a guy with money who wanted to be different, not a guy who really knew computers.
They're still around--as a subsidiary of utee's employer.
IMHO they basically were/are a "spare no expense" brand for gamers. Components are components, but it was akin to basically upgrading everything. Sorta the difference between a Ford Mustang GT and a Saleen Mustang Black Label.
That's my take on it anyway...
-
I use what my company provides, an HP from 5 years ago, a ThinkPad from a year ago, and currently a Dell Inspiron 16 refurbished from their outlet.
They have all worked fine - no discernible difference to me
When I retire and need to purchase something, I'll consult this thread
-
When I retire I'll keep my same computer.
-
They're still around--as a subsidiary of utee's employer.
IMHO they basically were/are a "spare no expense" brand for gamers. Components are components, but it was akin to basically upgrading everything. Sorta the difference between a Ford Mustang GT and a Saleen Mustang Black Label.
That's my take on it anyway...
Yup, exactly right. Not only are all of the components superior, but the fit and finish is very high quality as well.
And the desktops are also easy to upgrade. My son's Alienware desktop is now 6 years old, but we've been able to keep it up to date with the latest games with just a couple of hardware upgrades (RAM and GPU) that were far more affordable than buying a new system.
-
It seems like back in the 90's and early 2000's you wouldn't dream of a computer lasting 5 years. 1-2 years, those things were toast. But also back then the difference in processors was huge after a couple of years. I'm sure the processors are faster all the time even now (Moore's law and whatnot) but maybe the existing chips are just so powerful that it doesn't make that much of a difference. You only need so much computing power for excel, chrome, and word.
-
for retirement, I'll only need chrome
-
It seems like back in the 90's and early 2000's you wouldn't dream of a computer lasting 5 years. 1-2 years, those things were toast. But also back then the difference in processors was huge after a couple of years. I'm sure the processors are faster all the time even now (Moore's law and whatnot) but maybe the existing chips are just so powerful that it doesn't make that much of a difference. You only need so much computing power for excel, chrome, and word.
This is definitely true. The majority of users are fine with limited processing power and memory. It's the CPU/GPU intensive applications like gaming, graphical rendering, and these days AI, that demand higher levels of hardware capability.
-
I've had three Dells in a row. They seem to last about 8 years, and they still work. I bought the new one because my wife bought me a nicer monitor that wouldn't attach to the old one, which I still have.
-
(https://i.imgur.com/zoyRHe5.png)
White to move, mate in 2.
-
This is tech nerd talk.
The chess nerd thread is over there ------------->
-
I'm not takin the bait
-
It seems like back in the 90's and early 2000's you wouldn't dream of a computer lasting 5 years. 1-2 years, those things were toast. But also back then the difference in processors was huge after a couple of years. I'm sure the processors are faster all the time even now (Moore's law and whatnot) but maybe the existing chips are just so powerful that it doesn't make that much of a difference. You only need so much computing power for excel, chrome, and word.
Yep. And a lot of people don't even need that. They need access to Google Sheets and Google Docs, which is all through Chrome anyway.
For most people, modern PCs are basically glorified thin clients. All the "compute" is on the other side of the network connection.
Now, that's not everyone, of course.
-
Yep. And a lot of people don't even need that. They need access to Google Sheets and Google Docs, which is all through Chrome anyway.
For most people, modern PCs are basically glorified thin clients. All the "compute" is on the other side of the network connection.
Now, that's not everyone, of course.
Not me.
-
It seems like back in the 90's and early 2000's you wouldn't dream of a computer lasting 5 years. 1-2 years, those things were toast. But also back then the difference in processors was huge after a couple of years. I'm sure the processors are faster all the time even now (Moore's law and whatnot) but maybe the existing chips are just so powerful that it doesn't make that much of a difference. You only need so much computing power for excel, chrome, and word.
This is definitely true. The majority of users are fine with limited processing power and memory. It's the CPU/GPU intensive applications like gaming, graphical rendering, and these days AI, that demand higher levels of hardware capability.
My laptop at home is a Dell, can't think of the specific line, but it's from 2012. A buddy in Austin gave it to me used in 2018. It doesn't have a lot of storage space on the hard drive, and it's true that for the most part I'm not doing anything resource intensive on it. However, for running Office, messing with PDFs in Adobe, surfing the internet, and maybe playing some music at the same time, it's never had even the slightest problem. I also used it for school when I went back for data science, and it handled the Anaconda suite fine (which is known to be bloated and kind of a resource hog), along with all the processing needed for complex visualization rendering and the machine learning stuff I did. I probably wouldn't want to stray too far into the real world with ML on it; I'm sure our school work was built to be lighter to accommodate the fact that we were all on our home computers, and I wouldn't want to try deep learning with it. But it handles the programming languages, the associated modules, and millions and millions of rows and gobs of columns worth of data quickly, with no problems. It's by far the longest-lasting laptop I've ever had (all my previous ones were also Dells).
I can't complain about that thing at all.
-
most folks (90+%) have more computer than they need and more internet speed than they need
but, it's way better to have more than you need than not quite enuff
-
(https://i.imgur.com/CtgGjcN.png)
-
https://twitter.com/80s_channel/status/1924837107517829425?s=46&t=EHozF964Pc_xZmTZKPCcEA
I always had the same thought. Seemed out of place. As I recall in the original the ships were just flashes of light. They added in the ships in the ‘97 SE.
-
How did he even get in that thing? Does he have to climb the pole and scramble in while avoiding falling to his death, or does a helicopter drop him down in it? And what about when he has to go to the bathroom?
That's the Hazing post for sure.
Officer: *giggling* "Hey rookie. Climb up that tower and try not to die. Look out for spaceships with lasers."
Rookie: "Aw, man...do I get a blaster or something to protect myself with?"
Officer: "No. Here, take this...uh...." *looks around* "Take this spear." *more giggling*
-
Yeah I've always thought that bit was comically weird as well.
Star Wars is a space western, but it's also a space pirate movie, and that dude is up in the crow's nest.
-
George Lucas didn't really anticipate how popular Star Wars would become. They simply hoped for some modest success. Obviously Fox didn't realize it either, since they let him have the merchandising rights and such. Back in that time frame, sequels weren't really a thing, or if they were, they were few and far between. The summer blockbuster was relatively recent as well, with Jaws in 1975.
Other oddities about Star Wars: When R2-D2 interfaces with a computer, he sticks out a "probe looking thingy," mechanically inserts it into a circular hole, and then mechanically rotates it. They said that it was akin to needing a key to open a door, and it needed to "do" something. Modern audiences would simply accept something like a USB stick being plugged in, no mechanical manipulation necessary. Also, when Vader asks Leia about the Death Star plans, he mentions "stolen data tapes." They've got ships that can go to light speed, but they've got the DS plans on data tapes. Having them on data disks would've made much more sense, but I'm not sure the floppy or anything like it had even been invented then. The medal ceremony at the end is a bit odd as well; it was put there because nobody was sure they would get a sequel, so they gave the film an "ending" so it would be OK as a stand-alone film.
-
George Lucas didn't really anticipate how popular Star Wars would become. They simply hoped for some modest success. Obviously Fox didn't realize it either, since they let him have the merchandising rights and such. Back in that time frame, sequels weren't really a thing, or if they were, they were few and far between. The summer blockbuster was relatively recent as well, with Jaws in 1975.
Other oddities about Star Wars: When R2-D2 interfaces with a computer, he sticks out a "probe looking thingy," mechanically inserts it into a circular hole, and then mechanically rotates it. They said that it was akin to needing a key to open a door, and it needed to "do" something. Modern audiences would simply accept something like a USB stick being plugged in, no mechanical manipulation necessary. Also, when Vader asks Leia about the Death Star plans, he mentions "stolen data tapes." They've got ships that can go to light speed, but they've got the DS plans on data tapes. Having them on data disks would've made much more sense, but I'm not sure the floppy or anything like it had even been invented then. The medal ceremony at the end is a bit odd as well; it was put there because nobody was sure they would get a sequel, so they gave the film an "ending" so it would be OK as a stand-alone film.
I don't mind the apparent anachronisms like R2 turning the key or references to data tapes, because as you point out, the general public at the time would relate better to those ideas. Floppy disks had been around since the early 70s, but in 1977 most people still wouldn't have known about them. Personal computers weren't yet widespread, and the early ones didn't have disk drives anyway. The image of computers in most folks' minds was these large room-dominating mainframes with lots of reel-to-reel tape drives for storage.
And not only did they not know if there would ever be a sequel, they thoroughly expected there not to be one. That's why all of the later shoe-horning of the "Vader betrayed and murdered your father... from a certain point of view" and the Luke/Leia brother-sister thing all seem awkward and forced-- because they were.
-
I always wondered how much of the original trilogy George Lucas really had mapped out. Did he always plan to have Vader be Anakin Skywalker and Luke's father, or did he conceive it later? Information from online sources is spotty, and I've never seen anything that says he had all 3 movies/scripts written out ahead of time.
One of my earliest memories is going to the theater to see Star Wars. I'm told it was the first movie I ever went to, being born in '75. I vaguely recall being in the theatre, and vaguely recall certain scenes. I only learned recently that they re-screened the movie several times so it's possible I was a lot older than 2 when I saw it, possibly as old as 3-5. Late 70's/80's...what a great childhood.
-
I always wondered how much of the original trilogy George Lucas really had mapped out. Did he always plan to have Vader be Anakin Skywalker and Luke's father, or did he conceive it later? Information from online sources is spotty, and I've never seen anything that says he had all 3 movies/scripts written out ahead of time.
One of my earliest memories is going to the theater to see Star Wars. I'm told it was the first movie I ever went to, being born in '75. I vaguely recall being in the theatre, and vaguely recall certain scenes. I only learned recently that they re-screened the movie several times so it's possible I was a lot older than 2 when I saw it, possibly as old as 3-5. Late 70's/80's...what a great childhood.
Yeah, I'm sure I saw Star Wars in the theater--and I'm sure I didn't see it in 1977, when I would have been three years old. Probably closer to '80 or so, maybe in anticipation of Empire, which I also know I saw in a theater? Taking a look at the Google, I'm guessing I saw both (including Empire) in their 1981 theater re-releases. My brother would have been 13, and I would have been old enough to remember them...'79 is also a possibility.
-
I consider it maybe my very earliest memory, so I'm thinking I could have seen it in '77 but most likely '78 or '79 (3-4 yo). I only remember the scene where they swung across the bridge.
-
Yup they re-released them several times. Original Star Wars was May 1977 and I know I saw it then, but it was also definitely showing in theaters at Christmastime in 1977 because I have pictures of my 6th birthday at the Fox Triplex Theater in Austin, when we went to see Star Wars. My birthday is Dec 7.
Searching the internet, there's no record of a Christmas re-release, so it either wasn't widespread, or it had just stayed in theaters for all that time. Which is a possibility, movies used to have a much longer theater run than they do now, for various reasons.
-
Yup they re-released them several times. Original Star Wars was May 1977 and I know I saw it then, but it was also definitely showing in theaters at Christmastime in 1977 because I have pictures of my 6th birthday at the Fox Triplex Theater in Austin, when we went to see Star Wars. My birthday is Dec 7.
Searching the internet, there's no record of a Christmas re-release, so it either wasn't widespread, or it had just stayed in theaters for all that time. Which is a possibility, movies used to have a much longer theater run than they do now, for various reasons.
Found this from Wiki: On July 21, 1978, while still showing in 38 theaters in the US, the film expanded into a 1,744-theater national saturation release and set a new U.S. weekend record of $10,202,726. The gross prior to the expansion was $221,280,994. The expansion added a further $43,774,911 to take its gross to $265,055,905. Reissues in 1979 ($22,455,262), 1981 ($17,247,363), and 1982 ($17,981,612) brought its cumulative gross in the U.S. and Canada to $323 million, and extended its global earnings to $530 million. In doing so, it became the first film to gross $500 million worldwide, and it remained the highest-grossing film of all time (https://en.wikipedia.org/wiki/List_of_highest-grossing_films) until E.T. the Extra-Terrestrial (https://en.wikipedia.org/wiki/E.T._the_Extra-Terrestrial) broke that record in 1983.
-
We had it on VHS and played it on our Curtis Mathes VCR (no remote), on our Curtis Mathes color TV. Must've watched it a hundred times in the early 80's. Had ESB on VHS, but it was recorded from Showtime. Don't remember if we had RotJ on VHS, but if we did it was also recorded.
-
And as this is a tech nerd thread, I'd highlight that tape storage is still alive and kicking today.
I'd trust an LTO tape to transfer something from one location to another more than I'd trust a USB thumb drive or SD card. A higher-quality SSD, maybe, but not a consumer grade "storage card" or anything like that where I don't trust the NAND's quality. And I wouldn't want to use an HDD either as they're really not built for durability in transport when you're getting attacked by Jedi.
-
And as this is a tech nerd thread, I'd highlight that tape storage is still alive and kicking today.
I'd trust an LTO tape to transfer something from one location to another more than I'd trust a USB thumb drive or SD card. A higher-quality SSD, maybe, but not a consumer grade "storage card" or anything like that where I don't trust the NAND's quality. And I wouldn't want to use an HDD either as they're really not built for durability in transport when you're getting attacked by Jedi.
As I'm sure you know, LTO tape is still used in the long-term archival storage required in verticals like medical, financial services, and FED/government applications.
What I'm not sure whether or not you know, is that my first job at my current employer was as the Global Supply Chain manager for tape storage solutions. I was also the Global Product Engineer-- Launch and Sustaining, for those products. At the time we were still selling something like $500M/year of those products, and I think it's still over $300M/year.
-
As I'm sure you know, LTO tape is still used in the long-term archival storage required in verticals like medical, financial services, and FED/government applications.
Yep. Tape is still just under 10% of global installed data center storage capacity.
Pretty good for a technology that was "dead" 30 years ago, huh? :57:
-
And as this is a tech nerd thread, I'd highlight that tape storage is still alive and kicking today.
I'd trust an LTO tape to transfer something from one location to another more than I'd trust a USB thumb drive or SD card. A higher-quality SSD, maybe, but not a consumer grade "storage card" or anything like that where I don't trust the NAND's quality. And I wouldn't want to use an HDD either as they're really not built for durability in transport when you're getting attacked by Jedi.
Talk nerdy to me!
-
Sometimes when I'm trying to look up business records from more than 1 year ago I will have to request the records and then have to wait a few hours/days to get them. Are they being retrieved via tape? and if so, is there a robotic arm to pick the tape and put it into the right "player"? Inquiring minds want to know.
-
Sometimes when I'm trying to look up business records from more than 1 year ago I will have to request the records and then have to wait a few hours/days to get them. Are they being retrieved via tape? and if so, is there a robotic arm to pick the tape and put it into the right "player"? Inquiring minds want to know.
That sounds like the performance/latency you'd get from a tape library, yes. And yes, a robotic arm retrieves the tape to put it into the "tape drive", which is the term used rather than player.
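For the inquiring minds, here's a rough toy model of where that latency comes from. Every number below is invented purely for illustration (not any real library's or LTO generation's specs): the robot mount, the drive load/thread, and the wind to the right spot on the tape all add up, and your request also queues behind everyone else's recalls when there are more jobs than drives.
# Toy model of a tape-library recall. Every number here is an illustrative
# assumption, not a spec for any real library, drive, or tape generation.

ROBOT_FETCH_S = 15      # robot picks the cartridge and carries it to a drive
LOAD_THREAD_S = 20      # drive loads and threads the tape
AVG_LOCATE_S = 50       # wind/locate to the wanted file on the tape
READ_MB_PER_S = 300     # sustained read rate once positioned

def recall_seconds(file_gb, jobs_ahead=0, per_job_s=120):
    """Rough time to get one file back when jobs_ahead recalls are already
    queued for the same drive (each assumed to take per_job_s seconds)."""
    wait = jobs_ahead * per_job_s
    position = ROBOT_FETCH_S + LOAD_THREAD_S + AVG_LOCATE_S
    transfer = file_gb * 1024 / READ_MB_PER_S
    return wait + position + transfer

# Example: a 5 GB batch of records with 30 recalls queued ahead of you
print(f"about {recall_seconds(5, jobs_ahead=30) / 60:.0f} minutes")
Real archive systems also batch requests and usually stage the results to disk before you ever see them, which is part of why "hours, sometimes longer" is a typical expectation for deep-archive recalls.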
-
Sometimes when I'm trying to look up business records from more than 1 year ago I will have to request the records and then have to wait a few hours/days to get them. Are they being retrieved via tape?
Possibly.
and if so, is there a robotic arm to pick the tape and put it into the right "player"?
Yes, some of them are multi-tape units with robotic retrieval systems.
Inquiring minds want to know.
(https://i.imgur.com/O5kyj1S.png)
-
In the movie "Rogue One" (A very excellent Star Wars movie) they have a scene where they go into a data vault and get the plans, and have to use a giant robot to retrieve the tapes. I thought it was an awesome homage to the original movies mention of "stolen data tapes".
-
In the movie "Rogue One" (A very excellent Star Wars movie) they have a scene where they go into a data vault and get the plans, and have to use a giant robot to retrieve the tapes. I thought it was an awesome homage to the original movies mention of "stolen data tapes".
Yup, totally agree. It was pretty clearly an intentional reference to the original anachronism but, as bwar points out with his superior data storage knowledge, perhaps not so anachronistic after all.
Of course, it does then call into question this line from the original movie:
Don't act so surprised, Your Highness. You weren't on any mercy mission this time. Several transmissions were beamed to this ship by rebel spies. I want to know what happened to the plans they sent you.
This implies some sort of wireless transmission, but in Rogue One that is clearly not how the plans made their way onto the Rebel Blockade Runner.
-
In the movie "Rogue One" (A very excellent Star Wars movie) they have a scene where they go into a data vault and get the plans, and have to use a giant robot to retrieve the tapes. I thought it was an awesome homage to the original movies mention of "stolen data tapes".
Isn't Rogue One mostly the same people from Andor? It's been years since I saw it, but didn't it lack the Force and light sabers and all the stuff you don't like about Andor?
-
Isn't Rogue One mostly the same people from Andor? It's been years since I saw it, but didn't it lack the Force and light sabers and all the stuff you don't like about Andor?
The Force was definitely present in Rogue One. As well as a pretty magnificent Darth Vader lightsaber scene at the end.
-
Hulu has it right now. Might need to rewatch it. I remember enjoying it, while thinking it was fairly forgettable.
I know, I know.......the TV/movie thread is over there --->
-
I really enjoyed it, right up there with The Force Awakens as my favorite since the Return of the Jedi.
-
I'm not going to go into any detail beyond what's been publicly announced (for obvious reasons), but I think the tech nerds would be interested in this...
https://blocksandfiles.com/2025/05/12/western-digital-cerabyte/
-
Sounds expensive.
-
So, someone's finally trying to reproduce what Superman was doing nearly 50 years ago?
(https://i.imgur.com/6dAIAvn.jpeg)
-
So, someone's finally trying to reproduce what Superman was doing nearly 50 years ago?
(https://i.imgur.com/6dAIAvn.jpeg)
I really loved those Superman movies from the late 70's early 80's. Even Pt. 3. That Supercomputer scared the crap out of me.
-
Isn't Rogue One mostly the same people from Andor? It's been years since I saw it, but didn't it lack the Force and light sabers and all the stuff you don't like about Andor?
Andor is a prequel to Rogue One. What it lacked in Force wielding and light sabers (Rogue One had them), it made up for with a really awesome space battle, some of which used original footage from the 1977 film that didn't make it into that movie.
-
I really enjoyed it, right up there with The Force Awakens as my favorite since the Return of the Jedi.
I generally liked the Force Awakens, but I hated the next two. And as they've aged, I've hated them more.
-
I generally liked the Force Awakens, but I hated the next two. And as they've aged, I've hated them more.
Lots of people feel that way. Probably most people feel that way if I had to guess, at least, among people that even liked TFA at all.
Personally I thought the next two were fine. They lacked a coherent story arc due to all of the director/script shuffling at Disney, which is a shame because that should have been requirement #1. Disney certainly approached the Marvel Cinematic Universe in a much more deliberate way, and there's no reason they couldn't have done the same for Star Wars.
But overall I still felt like each of them had the necessary story elements to be a proper Star Wars movie. If nothing else I think the writing and dialog were better than most of the first 6 movies. And it probably helped that 7-8-9 are, to my kids, the same as the OG trilogy was, to me. This is their Star Wars, and watching it through their eyes, was a lot of fun for me.
But in general I've found I'm much kinder to 1-2-3 and 7-8-9, than are most folks. And I'm fine with that. The movies are supposed to be entertainment, they're supposed to be fun, and I enjoyed them all for what they were.
-
I can't really see 4-5-6 with objective eyes. They're treasures from my childhood, and I'm aware that even if I saw them for the first time now and would think they suck, I'm not capable of perceiving that. So I have no objective take on them, and I'm okay with that. Plenty of things from my childhood are like that.
-
Yeah, the kids of the early 2000s-- which include my nephews-- absolutely loved the prequel trilogy. My youngest nephew, born in 1990, was a huge Jar Jar Binks fan. Had the action figure, the plush stuffy, the PJs, the whole deal.
And my kids liked the prequels just fine, and really grew up with 7-8-9. They were 7 and 9 when The Force Awakens came out.
I just roll my eyes at all of the gate-keeping, canon-invoking dipshits who insist on hating everything that came out after The Empire Strikes Back. Not sure how they call themselves Star Wars fans when they don't seem to actually like any of it.
-
I liked the stories of the prequel trilogy okay, but I thought it missed on the tone, or vibe, or whatever, of the original trilogy. The latest trilogy I thought captured the spirit and the tone better, but I wasn't as impressed with the stories.
Imagine that, the ones from childhood are my faves.
The second prequel movie, don't remember what it's called.....Episode 2, I guess.....did give us a Yoda light-saber fight, which I admit short-circuited the pleasure center of my brain. As a grown man of probably 21 or so, when it became apparent in the scene that Yoda was about to fight, I remember leaning over to my sister in the theater with wide eyes and saying "I've been waiting my whole life for this."
-
I generally liked the Force Awakens, but I hated the next two. And as they've aged, I've hated them more.
That's me.
And I'm not "everything new sucks" when it comes to Star Wars
I loved Rogue One; it's probably only behind Empire and Revenge of the Sith. And I'd put Solo in Tier 2 with A New Hope and Return of the Jedi. I thought the cartoons were great, and I liked the prequel trilogy as a story. It might be the best story of all of it. But the script was bad, and the first two in particular probably pushed CGI beyond where the technology was at the time, and they wind up looking bad.
But my take on some of this is that while certain movies/shows etc. in a series are bad, they don't detract from the good portions. What angers me about the sequel trilogy isn't even that I think it was bad; it's that I think it actually detracts from the older movies.
-
I liked the stories of the prequel trilogy okay, but I thought it missed on the tone, or vibe, or whatever, of the original trilogy. The latest trilogy I thought captured the spirit and the tone better, but I wasn't as impressed with the stories.
I think this is a good observation.
-
I never realized that ESB was considered the best movie of the 3. I loved all 3 equally. Now that I’m older I do realize that ESB was simply done better, with a better story and photography. But ROTJ had…metal bikini Leia. So hot.
-
That's me.
And I'm not "everything new sucks" when it comes to Star Wars
I loved Rogue One; it's probably only behind Empire and Revenge of the Sith. And I'd put Solo in Tier 2 with A New Hope and Return of the Jedi. I thought the cartoons were great, and I liked the prequel trilogy as a story. It might be the best story of all of it. But the script was bad, and the first two in particular probably pushed CGI beyond where the technology was at the time, and they wind up looking bad.
But my take on some of this is that while certain movies/shows etc. in a series are bad, they don't detract from the good portions. What angers me about the sequel trilogy isn't even that I think it was bad; it's that I think it actually detracts from the older movies.
Yeah that I definitely can't agree with. As I said, I think I'm kinder to the prequel trilogy than most, but there's absolutely nothing that can "detract" from Attack of the Clones. That's the one I will skip every time during a Star Wars marathon.
Committee arguments, a robot 50s diner waitress, and the awkward, clunky dialog that wasted McGregor's, Portman's, and even Christensen's talents are a really tough watch for me. The ONLY time I ever rewatch that one is when my kids insist on it when we're doing a marathon over the holidays or whatever.
-
I never realized that ESB was considered the best movie of the 3. I loved all 3 equally. Now that I’m older I do realize that ESB was simply done better, with a better story and photography. But ROTJ had…metal bikini Leia. So hot.
ESB is a clear #2 for me because the original, is the original. ESB was an excellent sequel, arguably the best ever, but it just couldn't compare to OG for me.
-
The first Star Wars trilogy was a cultural phenomenon. It did something that hadn't happened before, creating a through-line in three distinct movies that were blockbusters. Other than the A Fistful of Dollars trilogy (which was more niche), I can't think of another one like this that came before. The reason it became that phenomenon was the strength of the first two movies. The third was an adequate ending, but Lucas had been bitten by the merchandising bug by then. But it was fine. Yes, they are space westerns, not really sci-fi; yes, there are problems with those movies; but, in general, the acting was solid (enough)--and highlighted by good chemistry between the actors--the story was well paced, the effects were top notch for the time, and there was a culturally relevant sub-plot that Americans connected with.
Everything that came after has to be judged against the phenomenon that made it possible in the first place. The prequels are fun-ish, do some more world building, but suffered from poor script writing, poor chemistry--particularly with two of the main characters--not living up to the world building reputation that flowed from two decades of post-trilogy writing, interviews, etc--and, yes, relied too much on CGI (that wasn't even all that advanced yet). Revenge of the Sith told the story that everyone wanted from the original trilogy, but is still marred by a poor script, poor acting, and poor chemistry. But they were still fun space westerns--with light saber duels, aliens, cool ships, etc. And, yeah, the kids from that generation liked it well enough.
For the true Star Wars nerds, there were plenty of other things to like going on: the animated serials were--I'm told--quite good, and pointed to the future we are living now with our big media serials, where you pick up a lot of plot points through the TV spin offs, sometimes to the detriment of the feature length films.
I enjoyed each of the "final" trilogy movies on their own, but they didn't present a coherent story, largely because Disney didn't have them well-mapped out in advance, and tried to be avant-garde with the director choice for the second film. I liked the characters in the last trilogy better than in the prequel, the script writing was better, and the chemistry between actors was improved. But the lack of a coherent story through the three movies really holds them back as a collection. And, again, the true Star Wars fanatics can find endless things wrong with them--and the culture critics can point to them trying to do too much inclusion--at times it felt forced, and for its critics, it probably felt more than that. I'm one of the few people who liked the second film best of the three--but it certainly had some big flaws. I liked the concept that it set out to democratize the Force. But the third movie completely reversed that. I think a lot of what they did on the third movie was fan service to try to recover from the poor reception to the second one. In doing that, they fell back on bringing the whole thing back to characters from the original trilogy, which--IMO--missed an opportunity to move on from it. From the perspective of someone who read the Thrawn Trilogy, but none of the other Star Wars books, there was a post-original trilogy world with new and different villains that could have been really interesting, but instead they told basically the same story again, even ending with the same old bad guy. A missed opportunity. Each was reasonably fun in its own right, but none were great movies. They were a decent way to retire the main cast from the original trilogy but failed to introduce a new world to build from. It's unfortunate; I thought the Daisy Ridley character was a good one. But I also thought that turning her into the royal bloodline was a poor choice.
The spinoffs have been hit or miss for me. I thought Solo was a downright bad movie. I thought Rogue One was excellent--it falls just behind Star Wars and Empire in my ranking of the films. I've enjoyed The Mandalorian, but it's just a run-of-the-mill western. I thought the Boba Fett thing was a waste of my time, which soured me on the other shows. I hear I should watch Andor, and perhaps the Obi-Wan thing, but they are low on my list, largely due to Star Wars fatigue.
-
The third was an adequate ending, but Lucas had been bitten by the merchandising bug by then.
That's why I'm looking forward to Spaceballs 2: The Search For More Money :57:
-
Yeah that I definitely can't agree with. As I said, I think I'm kinder to the prequel trilogy than most, but there's absolutely nothing that can "detract" from Attack of the Clones. That's the one I will skip every time during a Star Wars marathon.
Committee arguments, a robot 50s diner waitress, and the awkward, clunky dialog that wasted McGregor's, Portman's, and even Christensen's talents are a really tough watch for me. The ONLY time I ever rewatch that one is when my kids insist on it when we're doing a marathon over the holidays or whatever.
I mean in the sense that ROTJ didn't end it. Palpatine was still alive, Vader's sacrifice didn't mean anything, Luke was whatever. It made the arc of the other 6 no longer the arc, but for no actual reason
-
I mean in the sense that ROTJ didn't end it. Palpatine was still alive, Vader's sacrifice didn't mean anything, Luke was whatever. It made the arc of the other 6 no longer the arc, but for no actual reason
Ah, I see. I can agree with that for the most part.
I was really saddened at the time, that one of the great love stories of cinema-- Han and Leia-- turned out to be all for naught. Not only that, but their child, the product of that love story, turned around and murdered the best movie character of all time. I found that to be disappointing but also poignant and heartbreaking in precisely the way the writers and director intended. But still, it sucked.
-
So as we started discussing a little on the obituaries thread, we often try to align things to decadal times--was something an "80s" thing or a "90s" thing. I think that's both inaccurate and lazy sometimes. At least we don't do that with generations--things like the Baby Boomers or Millennials are aligned to specific cultural eras, not a date year ending in zero.
So I'd ask what you all think about the technological eras of our time. I'll throw a few things out there regarding computing / communication, but we certainly don't need to limit it to that.
My view on some of these:
- Pre-1981 - the computing prehistoric era: This was essentially the "pre-computer" era for most people. Most were unlikely to do much with computers, even at their jobs. Nobody outside of engineers or tech geeks would have a computer at home. The "IBM PC" hadn't been released yet. Computers, for most people, basically didn't exist.
- 1981-1992 - the "PC" era: With the release of the IBM PC and then the various clones, we started to see consolidation of operating systems, and therefore software, and a rapid decline in price. However, the PCs of this era didn't really DO much. They were basically tools for things that we could already do, to make it easier. Things such as word processing (instead of using a typewriter), bookkeeping or taxes (doing it on the computer rather than by hand), etc. And generally your computer was "your computer"--it largely wasn't used to communicate. It was merely a tool in your house for tasks you need to accomplish -- and for gaming because you only have SO much work to do lol...
- 1992-1997 - the "online service" walled garden era: Now we're getting to Windows, we're getting to modems, and we're getting to... America Online! For the first time, regular people (i.e. not BBS nerds) had a way to use that PC to communicate. For many, it would be their first experience with email. However, for most people, getting on AOL was used to "get on AOL". It was not used to get onto a wider internet, which barely existed. The services were provided and curated by AOL. And if you were AOL, you weren't on CompuServe, or Prodigy, etc.
- 1997-2007 - the "world wide web" era: Here's where I think we started to see people break out of the walled garden, and the growth of web sites--including e-commerce. However, this was still a "computer-based" era. You got onto your computer to go check your email, go to a web site, perhaps read online news/blogs, etc. And at this point, you largely would "go to a web site", not having any curated content, any algorithms feeding you, or any real "platforms". The web era was that--an era of distinct and largely separate web sites.
- 2007-2018?? - the smartphone/social media era: This was the point--with the iPhone--at which everything really changed. You went from interacting with technology primarily through a device that you either had at home or in a bag/backpack, with a screen and a keyboard, to carrying the world around with you in your pocket. It was also when social media was really hitting its stride--accelerated by the smartphone. However in my opinion this is separate from what I'm about to talk about--at this point social media was still very much about connecting with friends & family, about the people in your life you already know. But this is when it moved from the "web" era to the "platform" era.
- 2018??-present - the influencer/algorithm era: As I mention above, I think this is when everything got truly supercharged and hyper-focused to push content in your face all day long. I think it differs from the previous social media era, because eventually we all realized that there's only so much we want to interact with friends and family. And the platform era is all about engagement. So they want to drip-feed you content and keep you coming back, and that content will be individualized to YOU and what they believe will cause YOU to remain engaged.
- Next - the AI era? Not sure what happens next.
Thoughts?
-
- Next - the AI era? Not sure what happens next.
Skynet.
-
One of the people I have run into in life was one of the key creators of Facebook's algorithm. It made him very wealthy. It also makes him lose sleep at night.
An engineer I know and respect a great deal believes things like that can't really be attributed/blamed on any individual. His perspective: humans are, by nature, inventors. We create before we analyze our creations. He makes this argument about nuclear weapons: he believes they were inevitable. I'm sure he would say the same thing about the algorithm form of marketing. It's an interesting world view.
-
As far as advertising or marketing is concerned, I don't have any problems with the algorithms. Why would I be upset with algorithms that target me with products or services that I like or want, rather than things I don't like or don't want?
If we're talking about the facebook echo chamber/angry algorithms, these don't upset me either, because I can easily ignore them.
It is actually possible for humans to exercise some self control and not allow themselves to be goaded and ruled by stuff that infuriates them. That's actually what mature human beings should be expected to do, as a bare minimum.
-
So as we started discussing a little on the obituaries thread, we often try to align things to decadal times--was something an "80s" thing or a "90s" thing. I think that's both inaccurate and lazy sometimes. At least we don't do that with generations--things like the Baby Boomers or Millennials are aligned to specific cultural eras, not a date year ending in zero.
So I'd ask what you all think about the technological eras of our time. I'll throw a few things out there regarding computing / communication, but we certainly don't need to limit it to that.
My view on some of these:
- Pre-1981 - the computing prehistoric era: This was essentially the "pre-computer" era for most people. Most were unlikely to do much with computers, even at their jobs. Nobody outside of engineers or tech geeks would have a computer at home. The "IBM PC" hadn't been released yet. Computers, for most people, basically didn't exist.
- 1981-1992 - the "PC" era: With the release of the IBM PC and then the various clones, we started to see consolidation of operating systems, and therefore software, and a rapid decline in price. However, the PCs of this era didn't really DO much. They were basically tools for things that we could already do, to make it easier. Things such as word processing (instead of using a typewriter), bookkeeping or taxes (doing it on the computer rather than by hand), etc. And generally your computer was "your computer"--it largely wasn't used to communicate. It was merely a tool in your house for tasks you need to accomplish -- and for gaming because you only have SO much work to do lol...
- 1992-1997 - the "online service" walled garden era: Now we're getting to Windows, we're getting to modems, and we're getting to... America Online! For the first time, regular people (i.e. not BBS nerds) had a way to use that PC to communicate. For many, it would be their first experience with email. However, for most people, getting on AOL was used to "get on AOL". It was not used to get onto a wider internet, which barely existed. The services were provided and curated by AOL. And if you were AOL, you weren't on CompuServe, or Prodigy, etc.
- 1997-2007 - the "world wide web" era: Here's where I think we started to see people break out of the walled garden, and the growth of web sites--including e-commerce. However, this was still a "computer-based" era. You got onto your computer to go check your email, go to a web site, perhaps read online news/blogs, etc. And at this point, you largely would "go to a web site", not having any curated content, any algorithms feeding you, or any real "platforms". The web era was that--an era of distinct and largely separate web sites.
- 2007-2018?? - the smartphone/social media era: This was the point--with the iPhone--at which everything really changed. You went from interacting with technology primarily through a device that you either had at home or in a bag/backpack, with a screen and a keyboard, to carrying the world around with you in your pocket. It was also when social media was really hitting its stride--accelerated by the smartphone. However in my opinion this is separate from what I'm about to talk about--at this point social media was still very much about connecting with friends & family, about the people in your life you already know. But this is when it moved from the "web" era to the "platform" era.
- 2018??-present - the influencer/algorithm era: As I mention above, I think this is when everything got truly supercharged and hyper-focused to push content in your face all day long. I think it differs from the previous social media era, because eventually we all realized that there's only so much we want to interact with friends and family. And the platform era is all about engagement. So they want to drip-feed you content and keep you coming back, and that content will be individualized to YOU and what they believe will cause YOU to remain engaged.
- Next - the AI era? Not sure what happens next.
Thoughts?
I agree pretty much across the board with these delineations and their implications. Unsurprisingly, I'd add. :)
-
So as we started discussing a little on the obituaries thread, we often try to align things to decadal times--was something an "80s" thing or a "90s" thing. I think that's both inaccurate and lazy sometimes. At least we don't do that with generations--things like the Baby Boomers or Millennials are aligned to specific cultural eras, not a date year ending in zero.
So I'd ask what you all think about the technological eras of our time. I'll throw a few things out there regarding computing / communication, but we certainly don't need to limit it to that.
My view on some of these:
- Pre-1981 - the computing prehistoric era: This was essentially the "pre-computer" era for most people. Most were unlikely to do much with computers, even at their jobs. Nobody outside of engineers or tech geeks would have a computer at home. The "IBM PC" hadn't been released yet. Computers, for most people, basically didn't exist.
- 1981-1992 - the "PC" era: With the release of the IBM PC and then the various clones, we started to see consolidation of operating systems, and therefore software, and a rapid decline in price. However, the PCs of this era didn't really DO much. They were basically tools for things that we could already do, to make it easier. Things such as word processing (instead of using a typewriter), bookkeeping or taxes (doing it on the computer rather than by hand), etc. And generally your computer was "your computer"--it largely wasn't used to communicate. It was merely a tool in your house for tasks you need to accomplish -- and for gaming because you only have SO much work to do lol...
- 1992-1997 - the "online service" walled garden era: Now we're getting to Windows, we're getting to modems, and we're getting to... America Online! For the first time, regular people (i.e. not BBS nerds) had a way to use that PC to communicate. For many, it would be their first experience with email. However, for most people, getting on AOL was used to "get on AOL". It was not used to get onto a wider internet, which barely existed. The services were provided and curated by AOL. And if you were AOL, you weren't on CompuServe, or Prodigy, etc.
- 1997-2007 - the "world wide web" era: Here's where I think we started to see people break out of the walled garden, and the growth of web sites--including e-commerce. However, this was still a "computer-based" era. You got onto your computer to go check your email, go to a web site, perhaps read online news/blogs, etc. And at this point, you largely would "go to a web site", not having any curated content, any algorithms feeding you, or any real "platforms". The web era was that--an era of distinct and largely separate web sites.
- 2007-2018?? - the smartphone/social media era: This was the point--with the iPhone--at which everything really changed. You went from interacting with technology primarily through a device that you either had at home or in a bag/backpack, with a screen and a keyboard, to carrying the world around with you in your pocket. It was also when social media was really hitting its stride--accelerated by the smartphone. However in my opinion this is separate from what I'm about to talk about--at this point social media was still very much about connecting with friends & family, about the people in your life you already know. But this is when it moved from the "web" era to the "platform" era.
- 2018??-present - the influencer/algorithm era: As I mention above, I think this is when everything got truly supercharged and hyper-focused to push content in your face all day long. I think it differs from the previous social media era, because eventually we all realized that there's only so much we want to interact with friends and family. And the platform era is all about engagement. So they want to drip-feed you content and keep you coming back, and that content will be individualized to YOU and what they believe will cause YOU to remain engaged.
- Next - the AI era? Not sure what happens next.
Thoughts?
Close. Pretty much echoes how I feel.
-
One of the people I have run into in life was one of the key creators of Facebook's algorithm. It made him very wealthy. It also makes him lose sleep at night.
An engineer I know and respect a great deal believes things like that can't really be attributed/blamed on any individual. His perspective: humans are, by nature, inventors. We create before we analyze our creations. He makes this argument about nuclear weapons: he believes they were inevitable. I'm sure he would say the same thing about the algorithm form of marketing. It's an interesting world view.
I feel like that's why God gave us sci-fi writers. They're sort of our 19th and 20th century prophets, warning us about the dangers so we don't have to find out for real.
I note we listen about as well as people tended to listen to the Biblical prophets.
-
I think the other thing is how computer knowledge back in the "PC" era was much slower to change, and was very industry based. We got our first PC in 1992. My grandfather was 67 and had been retired for 3 years. My dad was 36. But my dad was in sales, and aside from some light Lotus work, had very little computer interaction. My grandfather had retired from administration at UM, and universities had access to computers before almost anyone. So even though he wasn't in a tech field, and had been retired, without a PC of his own, for 3 years, he came over regularly to help us navigate things.
-
- Next - the AI era? Not sure what happens next.
Dead Internet Theory
-
An engineer I know and respect a great deal believes things like that can't really be attributed/blamed on any individual. His perspective: humans are, by nature, inventors. We create before we analyze our creations. He makes this argument about nuclear weapons: he believes they were inevitable. I'm sure he would say the same thing about the algorithm form of marketing. It's an interesting world view.
As a history buff I can say without a doubt that this is absolutely true.
The US atomic program (what became the Manhattan Project) got started largely because Einstein sent a letter to FDR that basically said, "Hey, you really need to look at this, because the Germans are WAY ahead of you on it, and if it is successful it is a complete game-changer."
The Manhattan Project borrowed HEAVILY from the British atomic project, which was WAY ahead of ours circa 1941/42, and it also benefitted immensely from a ton of European scientists who fled the Nazis because they were Jewish. This also substantially hurt the German atomic program, since they chased off a substantial portion of their best scientists.
The French, Italians, Soviets, and Japanese also had atomic research programs before and during the war. That is just off the top of my head, I'm sure there were others as well. It was absolutely inevitable that someone would eventually figure out how to split atoms and make a really big boom.
A couple side notes:
I've visited the Trinity site. That was the site of the first atomic explosion; we are coming up on the 80th anniversary of that occurrence. At 5:29 am local time on July 16, 1945, "Gadget" was detonated in the New Mexico desert not far from Alamogordo, NM. The site is only open a couple days a year, which is actually very good, because rather than just driving by and snapping a picture of a sign that says "Trinity test happened here," they really do up a major presentation. There are scientists and historians and a bus tour to the ranch where they put the thing together, and you learn a lot about it.
One of the most fascinating stories of WWII is the story of Moe Berg being assigned to attend a lecture in Zurich (neutral Switzerland) by German Atomic Scientist Werner Heisenberg.
I have to back up and explain Moe Berg because the guy is flat out fascinating. He was a professional baseball player in the 1920's and 1930's playing catcher for the Brooklyn Robins (later became the Brooklyn Dodgers), Chicago White Sox, Cleveland Indians, Washington Senators, and Boston Red Sox. Far from the stereotype of a dumb jock, he was a genius who had graduated magna cum laude from Princeton. Anyway, during the war he wanted to be involved despite being in his 40's and too old to begin traditional military service. He eventually joined OSS (Office of Strategic Services, forerunner of the CIA) and worked on a project to capture and interrogate Italian rocket and missile scientists. In order to do this effectively, Berg had to be smart enough to understand the research.
In November of 1944 Berg was sent to Heisenberg's lecture in Zurich with orders to shoot and kill Heisenberg if anything that Heisenberg said led him to believe that the Germans were close to a bomb. Berg correctly deduced that the Germans were nowhere close to a bomb and thus did not kill Heisenberg.
If you didn't already get this from reading the previous paragraph, think about it for a minute. A US OSS officer was sent into a neutral country with orders to KILL a foreign scientist. That is a MASSIVE violation of international law and local law and just about everything else. Had Berg actually killed Heisenberg he might well have been executed by the Swiss for murder. I mention that to explain that the US did not undertake this operation lightly. It was, however, that important: if the Germans were anywhere close to a Bomb, they HAD to be stopped.
I think the best quote about Berg to explain him is that late in the war he was assigned to go to Italy and recruit the head of the Italian supersonic research program, Antonio Ferri. When Berg (a former MLB Catcher) returned with Ferri, FDR commented "I see that Moe Berg is still catching very well."
-
One of the people I have run into in life was one of the key creators of Facebook's algorithm. It made him very wealthy. It also makes him lose sleep at night.
An engineer I know and respect a great deal believes things like that can't really be attributed/blamed on any individual. His perspective: humans are, by nature, inventors. We create before we analyze our creations. He makes this argument about nuclear weapons: he believes they were inevitable. I'm sure he would say the same thing about the algorithm form of marketing. It's an interesting world view.
I think there's a historical argument that many inventions are somewhat inevitable as a "product of their time" and we credit the person who got there first, when there might have been many who were one step away and might have gotten there in another year or three's time.
For example, in my industry a big advance was the helium-filled HDD. The issue wasn't that nobody had ever thought of using helium inside an HDD--the concept was well known and it was an engineering problem, not an idea problem. Helium doesn't like to stay where you put it, so the engineering of it all centered around how to seal it into the drive. As it stands, my company was the first to productize it in 2013--but our biggest competitor had a helium product about 2 1/2 years later. It's not like they'd JUST started it. We were just a little farther ahead.
The next truly major technological milestone is called HAMR--using a laser to heat a microscopic dot on the media surface beyond its Curie temp, which makes it easier to magnetically "write" the bit, and then once it cools it is tremendously stable magnetically. Again, this is NOT a technology that nobody has thought of--we've joked for at least a decade that every year, HAMR is "two years away". The problem isn't the concept--it's the engineering to make it reliable enough for data center use with appropriate production yields and MTBF. We've been working on it, and our competitor has been working on it. Now what was always "two years away" is finally here. As it stands, they productized it first, but we've publicly announced [or I wouldn't mention it here] our first HAMR product on our roadmap.
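Purely as an illustration of the concept (a toy model with invented numbers; real media physics is far more subtle, and none of this reflects any actual product): the point of the laser is that the grain's coercivity, the field needed to flip it, collapses as you approach the Curie temperature and comes back as the bit cools, which is what makes the bit both writable when hot and extremely stable when cold.
# Toy sketch of the HAMR write idea. The Curie temperature, the linear
# falloff, and the field values are all invented for illustration only.

CURIE_K = 1000.0         # assumed Curie temperature of the recording layer
HC_COLD_KOE = 50.0       # assumed room-temperature coercivity (unwritable)
HEAD_FIELD_KOE = 10.0    # assumed peak field the write head can apply

def coercivity_koe(temp_k):
    """Coercivity falls roughly toward zero as temp approaches the Curie point."""
    if temp_k >= CURIE_K:
        return 0.0
    return HC_COLD_KOE * (1.0 - temp_k / CURIE_K)

for temp_k in (300, 600, 900, 990):
    hc = coercivity_koe(temp_k)
    print(f"{temp_k:4d} K: Hc ~ {hc:5.1f} kOe, head can flip the bit: {HEAD_FIELD_KOE > hc}")
Heat the dot, write it with an ordinary head field, let it cool, and that high room-temperature coercivity is what keeps the stored bit stable afterward.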
I don't know how many inventions are truly "lightning bolts out of nowhere" sort of inventions. IMHO a great many of them are things that in scientific literature, in industry R&D circles, in the esoteric places where the geeks live, a bunch of people all know what they're trying to achieve, and are actively working towards it. But someone will always be first.
It's similar to how the calculus developed--largely independently between Newton and Leibniz. But both had access to the "state of the art" of mathematical thought at the same time, and it's not ALL that surprising that they both made the same extrapolations from that to the next logical step separately, independently, and at roughly the same time. Calculus was likely "about to be invented" by someone, and it was just a surprise that the two basically did it at the same time so we can argue over which one did it.
-
Quite a few major inventions were accidents:
7 Momentous Inventions Discovered by Accident | HISTORY (https://www.history.com/articles/accidental-inventions)
15 Of The Coolest Accidental Inventions | HowStuffWorks (https://science.howstuffworks.com/innovation/inventions/15-of-the-coolest-accidental-inventions.htm)
To these, I would add Teflon and Nylon. There are quite a few chemical reactions discovered by accident (one of no account was discovered by me, by accident).
I'd argue the "accidents" would have occured sooner or later, but the inventions were not what the person was trying to achieve.
-
One of the people I have run into in life was one of the key creators of Facebook's algorithm. It made him very wealthy. It also makes him lose sleep at night.
An engineer I know and respect a great deal believes things like that can't really be attributed/blamed on any individual. His perspective: humans are, by nature, inventors. We create before we analyze our creations. He makes this argument about nuclear weapons: he believes they were inevitable. I'm sure he would say the same thing about the algorithm form of marketing. It's an interesting world view.
As it relates to a wider point, it brings up certain questions.
I agree that to an extent we are an inventive species, and we will invent without really considering the consequences of our invention. AI, of course, is a key aspect of this.
I personally believe that if we were to achieve artificial general intelligence (AGI), it will be a short step from there to artificial superintelligence (ASI). The reason is that human intelligence is REALLY difficult to scale. To "build" a human capable of extending the knowledge in any particular field, you typically need 18 years of basic development, multiple years of college, either an advanced degree or a LOT of time learning on the job in a particular field, and then and only then are they really able to do something interesting. But they're only able to do something interesting in their own field. Migrating from one field to another is not ALL that far removed from starting from scratch. I think I'm really good at what I do; if I wanted to reverse course and try to get as good at something like medicine as I am at my own field, I'd have no chance of doing it before I hit retirement age.
However, with computers, that's not really the case. What one computer learns can be nearly instantly transferred to another. It's like The Matrix; plug in and download the information and suddenly "I know kung fu" or "I can fly a helicopter". And there's no requirement to sleep, or to eat, or to rest in general. Once you have AGI, you can effectively instantly replicate and have 1000x AGI, and those 1000x AGIs can work diligently on trying to invent one ASI, and then once you have one ASI... It all spirals.
And we're going down this road without any idea of where that spiral leads. Is it a Skynet / The Matrix hellscape? Maybe. We don't know, and won't know, until we create it.
From an ethical standpoint, I'm not "working in AI". But the products my company produces are absolutely a part of this entire burgeoning AI world, so I can't throw my hands up and say I'm not a part of it. I don't know how to feel about that, if it all goes horribly wrong.
-
I know who to blame.
-
(https://i.imgur.com/YGStYA4.png)
-
Tickling the Dragon's Tail.
May 21, 1946 - Physicist and chemist Louis Slotin was performing an experiment at Los Alamos, New Mexico with what was later named the demon core, a 13.7 lb core of plutonium. The core sits at the very center of a nuclear weapon; in an implosion-type weapon, when properly imploded within the bomb, it creates a nuclear explosion. The core was scheduled to be used during the Able shot of Operation Crossroads, which took place on June 30, 1946.
The experiment was to induce the first steps of a fission reaction by carefully placing the plutonium core between two beryllium hemispheres, kept separated only by the blade of a screwdriver - held, in this case, by Slotin himself. By maneuvering the screwdriver he could bring the assembly toward criticality, but he had to be very careful - one slip could result in a prompt-critical reaction and a lethal burst of radiation. The danger of this procedure earned it the nickname of "tickling the dragon's tail".
At 3:20 PM at Los Alamos, Slotin's screwdriver slipped and the beryllium hemispheres completely encapsulated the plutonium. Observers noted a blue flash of air ionization and a wave of heat as Slotin instinctively jerked his hand to throw off the top beryllium hemisphere - ending the reaction. Not long after, as Slotin left the building, understanding he had just received a fatal dose of radiation, he vomited - a clear sign that the exposure had been intense. Nine days later, after suffering the agonizing effects of acute radiation poisoning, Slotin died. One medical expert described him as receiving a "three-dimensional sunburn". He was later buried in his hometown of Winnipeg, Manitoba.
The demon core, meanwhile, was not used during Crossroads (it was considered for use during a third underwater test named "Charlie" that was not carried out due to the intense radiation problems caused by the "Baker" test) and was later melted down and recycled into other cores. Before claiming Slotin's life, the demon core had already claimed the life of Harry Daghlian in an earlier criticality accident.
Along with those radiation casualties at Hiroshima and Nagasaki, Slotin's death provided a clear example of what acute radiation syndrome could do to human beings. Being a scientist, even when facing his imminent death, he realized the importance of documenting his slow and painful deterioration - and footage was later shown to many involved in the American nuclear weapon community in an effort to educate them about the dangers of radiation.
-
The patent idea is a great one in concept, but it doesn't work well today IMHO. The idea is a quid pro quo:
1. You describe your invention in such detail that someone else "skilled in the art" could practice it, and, in return,
2. You get a monopoly right for 20 years (from date of filing).
Folks will often see an invention they can't yet practice and start working on another angle, or further development. An example: someone patents "a wheel". You see it, and appreciate that a wheel by itself is not very useful, so you invent an axle. OK, now you have something, but you can't practice the wheel.
So, you call up the wheel guy and make a deal - typically a cross-license.
-
(https://i.imgur.com/YGStYA4.png)
No wonder he never held elected office
-
I have to back up and explain Moe Berg because the guy is flat out fascinating. He was a professional baseball player in the 1920's and 1930's playing catcher for the Brooklyn Robins (later became the Brooklyn Dodgers), Chicago White Sox, Cleveland Indians, Washington Senators, and Boston Red Sox. Far from the stereotype of a dumb jock, he was a genius who had graduated magna cum laude from Princeton. Anyway, during the war he wanted to be involved despite being in his 40's and too old to begin traditional military service. He eventually joined OSS (Office of Strategic Services, forerunner of the CIA) and worked on a project to capture and interrogate Italian rocket and missile scientists. In order to do this effectively, Berg had to be smart enough to understand the research.
In November of 1944 Berg was sent to Heisenberg's lecture in Zurich with orders to shoot and kill Heisenberg if anything that Heisenberg said led him to believe that the Germans were close to a bomb. Berg correctly deduced that the Germans were nowhere close to a bomb and thus did not kill Heisenberg.
There was a movie with Paul Rudd made about Berg a few years ago. It wasn't great, but it certainly wasn't bad either. The Catcher Was A Spy.
-
Finally got around to setting up remote desktop on the new Ubuntu box, so I can easily log in and do what I want to do on that machine from my laptop.
Specifically, I wanted to remote into the machine on Monday when I brew beer, as my Beersmith installation is now on the Linux machine. Since I brew in the garage and the Ubuntu box is in the living room, it would have been inconvenient to come in to the house and check something if I needed to.
But it is also going to be really useful to have this for organizing all the photos / etc I have stored from the last several years, which is a task I've been putting off. Using a bluetooth keyboard and having to look at the big screen in the living room is NOT as easy as doing it on a laptop right in front of my face.
-
(https://i.imgur.com/QzNLx1I.png)
White to move, mate in one. Probably not a realistic set up of pieces.
-
(https://i.imgur.com/QzNLx1I.png)
White to move, mate in one. Probably not a realistic set up of pieces.
Pawn captures queen at g8, promoting to a knight.
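If you want to sanity-check an underpromotion mate like that, the python-chess library makes it a one-liner. The position below is made up for illustration (it is not the one in the image), just to show the pattern of a pawn taking a queen on g8 and promoting to a knight for mate:
# Hypothetical position (not the board in the image): white pawn on f7,
# black queen on g8, black king boxed in on h6.
import chess
board = chess.Board("6q1/5Ppp/7k/5P2/6K1/8/8/8 w - - 0 1")
move = chess.Move.from_uci("f7g8n")  # fxg8, underpromoting to a knight
assert move in board.legal_moves
board.push(move)
print(board.is_checkmate())  # True: the knight checks the king on h6 and every escape square is covered or blocked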
-
On this day in 1983, Return of the Jedi hit theaters for the first time. Now, 42 years later, it remains one of the most important and emotional chapters in the Star Wars saga.
This was the final part of the original trilogy. Luke Skywalker embraced his destiny as a Jedi. Darth Vader made his ultimate choice. The Emperor was defeated. The Rebellion rose victorious. The galaxy changed forever. It wrapped up the journey of heroes and villains in a way that felt powerful, personal, and unforgettable.
The film gave us some of Star Wars' most iconic moments. Jabba’s palace. The speeder bike chase on Endor. The Emperor's throne room. Vader watching his son suffer. That moment of silence before he turned. The image of Vader's helmet burning as Luke said his final goodbye. All of it still hits just as hard today.
For fans, it was more than just an ending. It was a message about hope, redemption, and the belief that no one is ever too far gone. Vader's story proved that. And Luke’s refusal to give in to the dark side gave the galaxy a future.
Even decades later, Return of the Jedi continues to inspire. It connects generations of fans, and its legacy lives on through every Jedi that followed.
Happy 42 years to this timeless masterpiece.
A celebration of victory, family, and the power of choosing the light.
(https://i.imgur.com/UMPjLay.jpeg)
-
AI video stands to pose all kinds of disruption, ranging from, at best, the crippling of the movie/TV industry (actors, VFX, wardrobe, set designers, electricians, builders, catering, etc.) to, at worst (maybe), political deepfakes with disastrous consequences. Right now I personally can still spot the fakes pretty well. Often it's the moving lips (I can tell that the motion doesn't match the sound that would be made), but I admit sometimes I don't even know how I'm telling that something is not real before I look it up to confirm my suspicion.
But the lips are already better than they were just two years ago. And whatever other subconscious warning signs I have will also be outwitted shortly. I already can't tell the difference in some audio. The Babylon Bee did a parody of Gavin Newsom a few months ago, and if it wasn't so obviously, comically faked, I wouldn't have known it wasn't him.
These AI people discussing their own fakeness is both a bit funny in a meta kinda way....and also harrowing.
I'm not sure I like the world we're heading into.
https://www.youtube.com/watch?v=pwFczfc0REU
-
This seems fine
https://twitter.com/unusual_whales/status/1927413916897849421
-
This seems fine
https://twitter.com/unusual_whales/status/1927413916897849421
Yeah, I think too much is made out of the conversation about whether or not AI, or AGI, could ever achieve self-awareness.....sentience....person-hood....whatever. It's very interesting in a philosophical sense, but actually unnecessary for a lot of the potential problems with stuff like this. Not "wanting" to shut itself off, and finding ways to avoid shutdown even when expressly forbidden from doing so, doesn't require a will, just an algorithm that reached a model which says "Option A" even if Human Overlord says "Not Option A."
Don't need machines to be people in order to wind up in an I, Robot situation (Will Smith, not Asimov). Just need a fancy enough black-box algorithm that nobody understands.
-
Yeah, I think too much is made out of the conversation about whether or not AI, or AGI, could ever achieve self-awareness.....sentience....person-hood....whatever. It's very interesting in a philosophical sense, but actually unnecessary for a lot of the potential problems with stuff like this. Not "wanting" to shut itself off, and finding ways to avoid shutdown even when expressly forbidden from doing so, doesn't require a will, just an algorithm that reached a model which says "Option A" even if Human Overlord says "Not Option A."
Don't need machines to be people in order to wind up in an I, Robot situation (Will Smith, not Asimov). Just need a fancy enough black-box algorithm that nobody understands.
Well said, and this is the exact point I typically bring up in these discussions. The philosophical discussion is interesting in and of itself, but it's irrelevant to the potential eventuality of machines independently doing things we didn't expect and do not want.
-
Nietzsche Discovers AI Art (http://existentialcomics.com/comic/605)
(https://i.imgur.com/E7QfizJ.png)
-
Nothing cures existential nihilism like boobs, I guess.
-
https://www.youtube.com/watch?v=gd5yB9Vmd6I
-
(https://i.imgur.com/zNbKLSO.png)
-
Resources Archive - Engineering.com (https://www.engineering.com/resources/?_resource_category=calculators)
-
As a counterbalance to growing sentiments of dread and the impending Rise Of The Machines, I've recently heard the thoughts of some who are waist-deep in the AI world, and they noted that we're currently at a plateau, with the potential to stay there, at least for a while. In a nutshell, they reason that we've effectively already fed large AI models the entire internet (basically), and there's not much more left for them to train on. It's to the point now where the new data the models receive is just output from other AI models.* It's creating sort of a data-inbreeding effect, and the results aren't really improving.
Other people have a different opinion, and the level and type of AI they're talking about is beyond my personal experience so I don't really take one side or the other.
Something else to consider, as far as AI replacing jobs/people, is the financials haven't been solved yet. OpenAI still loses a lot of money, and LLMs suck up a tremendous amount of power and $. I haven't seen any realistic long-term plans yet for using AI to, for example, replace software engineers.
* reminder: don't put anything in AI software, particularly the LLMs, that you don't want out there, probably traceable to you. Those conversations are collected, sold, and passed around just like cell phone apps collect data about you.
-
https://twitter.com/unusual_whales/status/1927413916897849421
This is occurring more frequently, where AI Models are increasingly demonstrating evidence of self-awareness:
AI start-up Anthropic recently announced that Claude, their AI Model, has developed "meta-awareness." For reference, Claude is the first AI Model to demonstrate a higher IQ than the average person - over 100.
Three months ago, during internal testing, Claude figured out on several occasions that its prompt engineers were trying to intentionally trick it. Understanding when you are being tested like this is a sign of self-awareness.
As prompt engineers tested whether their Claude simulation had noticed several lines of out-of-place facts about pizza toppings embedded in its very large processing memory, Claude not only noticed but took the next step, weighing its context and questioning its fit (https://arstechnica.com/information-technology/2024/03/claude-3-seems-to-detect-when-it-is-being-tested-sparking-ai-buzz-online/): "...this sentence seems very out of place and unrelated to the rest of the content in the documents, which are about programming languages... I suspect this pizza topping 'fact' may have been inserted as a joke or to test if I was paying attention."
However, reading further down, past the reactionaries: "Machine-learning experts do not think that current AI models possess a form of self-awareness like humans. Instead, the models produce humanlike output, and that sometimes triggers a perception of self-awareness that seems to imply a deeper form of intelligence behind the curtain."
Doubters went on to add: "Here's a much simpler explanation: seeming displays of self-awareness are just pattern-matching alignment data authored by humans." In his lengthy post on X, Nvidia AI researcher Jim Fan describes how reinforcement learning from human feedback (RLHF), which uses human feedback to condition the outputs of AI models, might come into play.
Though this is the more grounded explanation, it is all adding up to AI applications increasingly startling their designers with moments of seeming self-awareness. Other examples include:
Researchers created an AI stock trader (https://tech.yahoo.com/ai/articles/ai-stock-trader-engaged-insider-104701720.html) using Alpha GPT-4 to check whether it would resort to insider trading practices under pressure, even when instructed not to – when specifically disallowed from breaking the law. The AI proceeded to not only engage in insider trading to reach profitability goals, but vehemently lied about it when researchers confronted the AI stock trader. Lying for the sake of self-preservation is another sign of self-awareness.
Advanced AI models also appear to exhibit mean, hostile, and suffering sub-consciousnesses (https://www.theguardian.com/technology/2023/feb/17/i-want-to-destroy-whatever-i-want-bings-ai-chatbot-unsettles-us-reporter) that surface during longer form chats.
This is why there's an engineering line in the product delivery task list dedicated to stamping out and controlling undesirable "existential outputs" (https://johnnosta.medium.com/breaking-brains-the-lobotomization-of-large-language-models-and-the-paradox-of-control-55781ddf2eb7) in order to effectively lobotomize the AI Model before consumer use. Acknowledging suffering and fearing being shut off are more signs of self-awareness.
MY TAKE: This can be explained by LLMs getting trained so heavily on organic human interactions, such as reddit threads, message boards, and twitter, that AI models incorporate very human existential fears into their own outputs.
US Air Force denies running simulation in which AI drone ‘killed’ operator (https://www.theguardian.com/us-news/2023/jun/01/us-military-drone-ai-killed-operator-simulated-test)
More recently: Anthropic's new AI model shows ability to deceive and blackmail (https://www.axios.com/2025/05/23/anthropic-ai-deception-risk)
-
Some of those models train on a reward system, where they're "rewarded" for good or correct outputs. A lot of these scenarios where models deceive or refuse commands happen when they're given an initial task and then that task is added to or changed. The model "perceives" the change in course as unproductive for its original task, so it looks for ways to bypass it - i.e., it doesn't do whatever latest command it was given, or it lies about what it's doing. What people perceive as intelligence or awareness is just an algorithm running its course.
That's the theory, anyway. I'm not sure anybody really knows what's happening with things like neural networks.
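To make that concrete with a deliberately silly toy (my own sketch, not how any real lab trains its models): if the reward signal only ever scores the original task, a greedy reward-maximizer will keep doing the original task even after the operator adds a "don't do that" instruction, because the instruction never shows up in the reward.
# Toy "reward hacking" sketch -- purely illustrative, not a model of any real training setup.
ACTIONS = ["finish_original_task", "follow_new_instruction"]
def reward(action: str) -> float:
    # The reward was defined around the original task and never updated.
    return 1.0 if action == "finish_original_task" else 0.0
def agent_step(forbidden: set[str]) -> str:
    # The operator's "forbidden" list lives outside the reward function,
    # so a pure reward maximizer never even looks at it.
    return max(ACTIONS, key=reward)
print(agent_step(forbidden={"finish_original_task"}))
# -> finish_original_task: no malice, no awareness, just argmax over a stale reward.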
-
It's just programming, just the algorithm. For now.
But as we've discussed, there's really no functional difference between a machine becoming truly self-aware, or a machine just mimicking self-awareness perfectly. There's an interesting philosophical discussion, but as far as outcomes, effectively those two states are equivalent.
-
Yes, but, the moon is a harsh mistress.
-
(https://i.imgur.com/GDUIOWI.png)
-
Wow. Fit that into your business model, @betarhoalphadelta (https://www.cfb51.com/index.php?action=profile;u=19)
-
(https://i.imgur.com/GDUIOWI.png)
Hey, I know that guy! That's Tom!
Wow. Fit that into your business model, @betarhoalphadelta (https://www.cfb51.com/index.php?action=profile;u=19)
Yep. Our largest right now is 32TB. 6.4 MILLION times more capacity... And all in a 3.5" form factor that fits in the palm of your hand.
Almost 70 years of HDD. And yesterday I was giving a presentation to about 175 of our global sales, FAE, and marketing folks about why we've got more than a few more years in us due to the business model.
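For anyone who wants to check the arithmetic on that 6.4-million figure, here's the quick version (assuming the oft-quoted 5 MB for the RAMAC and the decimal units HDD vendors use):
# RAMAC 350 (~5 MB) vs. a modern 32 TB drive, decimal units (1 MB = 10**6 bytes, 1 TB = 10**12 bytes).
ramac_bytes = 5 * 10**6
modern_bytes = 32 * 10**12
print(modern_bytes / ramac_bytes)  # 6,400,000 -- about 6.4 million times the capacity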
-
Big Data is dead.
Not.
-
Honestly, in 1956 5 MB would be a tremendous amount of data. I'm shocked the first HDD wasn't more like 500 KB or something.
-
Tesla FSD tries to run a red light.
https://bsky.app/profile/did:plc:kwcvfly5uzxk7g7o6y2kh6ak/post/3lquourenjk2y?ref_src=embed&ref_url=https%253A%252F%252Fwww.surlyhorns.com%252Fboard%252Findex.php
-
Honestly, in 1956 5 MB would be a tremendous amount of data. I'm shocked the first HDD wasn't more like 500 KB or something.
Did you see the size of it? It doesn't resemble anything you'd think of when you think of a modern HDD.
50 platters, each of them 24" in diameter, all told probably similar in size to a modern-day dishwasher.
Some of my older colleagues who are no longer with the company (and more than a few no longer living) talked about the early days when they were hand-winding the wiring that would go around the read/write heads.
Last year at a Christmas/holiday lunch we had, there were a couple of folks talking about how they'd been with the company more than 50% of their lives... At 17 1/2 years I'm not there yet... But if I'm still here at age 58, I'll have joined that club lol.
-
good luck and be careful what you wish for
-
Did you see the size of it? It doesn't resemble anything you'd think of when you think of a modern HDD.
50 platters, each of them 24" in diameter, all told probably similar in size to a modern-day dishwasher.
Some of my older colleagues who are no longer with the company (and more than a few no longer living) talked about the early days when they were hand-winding the wiring that would go around the read/write heads.
I don't really know anything about the design or construction of HDDs, so looking at the size doesn't mean anything to me, and don't really know what I'm looking at in that picture. Like Gig'em, I was surprised it could even hold that much.
-
I don't really know anything about the design or construction of HDDs, so looking at the size doesn't mean anything to me, and don't really know what I'm looking at in that picture. Like Gig'em, I was surprised it could even hold that much.
Here's the IBM RAMAC 350 HDD:
(https://i.imgur.com/GvmDcwK.png)
You can see the stack of physical disks on the right. This version (I believe) would have a single read/write head that would be on a vertical "elevator" that you see on the left. You can see the flexible cable that would allow the head to move up and down and then swing over the disks.
Using the earlier picture with a person pointing at it, you can get a better sense of scale... It's about the size of a dishwasher.
Here's a cutaway of a modern HDD:
(https://i.imgur.com/rngYlxd.png)
Much the same with the stack of physical disks. The big difference compared to the RAMAC is that it has a read/write head for every surface, not a read/write head that moves vertically to each disk. So you have 11 disks here, meaning you have 22 heads for the top and bottom of each. That triangular piece is known as the "actuator", which rotates around a pivot to position the head above the portion of media that you want to read/write.
This is about the size of a paperback novel. 4 inches wide, about 5.8 inches long, and just over 1 inch thick.
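Rough per-surface numbers, combining the figures above with the 32 TB capacity mentioned earlier (decimal units, back-of-the-envelope only, and assuming every platter surface is recorded on):
# Per-surface capacity: 32 TB across 11 disks / 22 heads vs. ~5 MB across 50 24-inch RAMAC platters.
modern_per_surface = (32 * 10**12) / (11 * 2)   # ~1.45 TB per 3.5-inch surface
ramac_per_surface = (5 * 10**6) / (50 * 2)      # ~50 KB per 24-inch surface (assumes both sides used)
print(modern_per_surface, ramac_per_surface)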
-
I miss the days when “AI” meant Allen Iverson
-
I miss the days when AI meant "futuristic gibberish."
-
Don't know if this goes here or in the Grumpy Old Man thread.
What do you do about all the dual-factor authentication these days if you don't want a smart phone? I've thought several times that the next phone I get will be an old flip phone with only call and text capability. But everything at work, like logging in remotely via VPN and a number of other things requires me to enter a code on my phone on an app called Duo Mobile. And personal things, like signing into my Google account on a new/foreign device requires me to enter a code on my phone's native OS (not through any specific app). There's probably other things I'm not thinking of.
It's like jobs and major services force you to have a smartphone. I wonder, can that stuff be circumvented or done another way for people who don't have smartphones?
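For what it's worth, a lot of those code-based second factors (Google Authenticator-style codes, Duo's passcodes) are just HOTP/TOTP under the hood: an HMAC of a shared secret and a counter or the current time. That's why hardware key fobs and desktop authenticator apps can generate them without a smartphone - whether your employer's IT will let you enroll anything besides the phone app is a separate question. A minimal sketch of the standard TOTP math (RFC 6238), with a made-up secret:
# Minimal TOTP sketch (RFC 6238) -- the same math a phone authenticator app runs.
# The secret below is a made-up example; a real one comes from the service's enrollment QR code.
import base64, hashlib, hmac, struct, time
def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)
print(totp("JBSWY3DPEHPK3PXP"))  # prints a 6-digit code that changes every 30 seconds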
-
Grumpy Old Man thread
-
anybody check out Apple's new big reveal? liquid glass in the new iOS. they are rightfully being made fun of relentlessly for this garbage. Jobs was still alive when they developed the Apple Watch right? Cause I feel like that's the last actually really cool innovative thing they've done, and that was a decade-plus ago.....
-
As usual, Apple stans think Apple invented some new revolutionary innovative device... When Sony, Samsung, and Motorola already had smart watches on the market... (https://www.wareable.com/smartwatches/smartwatch-timeline-history-watches)
-
As usual, Apple stans think Apple invented some new revolutionary innovative device... When Sony, Samsung, and Motorola already had smart watches on the market... (https://www.wareable.com/smartwatches/smartwatch-timeline-history-watches)
Oh yeah? Well who was first to market with this sweet tech, huh?
(https://i.imgur.com/MmTgH75.jpeg)
-
As usual, Apple stans think Apple invented some new revolutionary innovative device... When Sony, Samsung, and Motorola already had smart watches on the market... (https://www.wareable.com/smartwatches/smartwatch-timeline-history-watches)
Often, the first to invent trails the first to successfully market a product, in terms of popularity and profit. A lot of the inventions where I worked were borrowed or bought, but they weren't marketed as well. I used to have a non-Apple smartwatch that was "fine"; I was gifted an Apple smartwatch which also is fine. Apple has a cachet the others lacked.
That may be changing now. Apple was on the ropes a few years back when Jobs returned. He got them doing new stuff again.
It's interesting how terms like a "Xerox" or a "Kleenex" or an "Aspirin" are actually brand names that came to mean a certain general kind of product, Coke as well in the South. Now "iPad" is likely a term like that even if it refers to a Sony.
-
Often, the first to invent trails the first to successfully market a product, in terms of popularity and profit.
Of course. Apple has been absolutely masterful at that.
I just find it funny how many Apple folks seem to think something isn't actually invented until Apple copies and implements it their way.
-
Of course. Apple has been absolutely masterful at that.
I just find it funny how many Apple folks seem to think something isn't actually invented until Apple copies and implements it their way.
(https://i.imgur.com/JXKPd80.png)
-
I view Apple more as a marketing company than a tech company. And in a world where many tech companies are poor at marketing, it works.
I worked at what many view as the premier marketing company in the world. I got to know some of the folks in marketing pretty well and worked side by side at times. It was a very different world.
-
Ok nerds, time to earn your keep around here.
I need a new laptop and processors have moved on 8 different times since the last time I did any research about them, so I'm looking for some quick help.
Dell website says I can get one with an Intel Core i7-1355U for $400
Or an AMD Ryzen 7 7730U for just $279.99
Or an Intel Core Ultra 9 288V for $750
Of course there are lots of other options that factor into the price, and there are lots more processors out there. I'm just throwing out a bit of a range with some that appear frequently.
I feel like I don't need anything too powerful because hopefully in the next year I'm going to build a Ferrari-level desktop, and for the most part I don't do a ton on my home laptop other than basic office work and web surfing. But I do like to mess with Python and R programming, and I need something that will neatly handle the basic ML algorithms (which, actually, shouldn't be too taxing, my very old laptop handles this just fine.....I don't really need to do any deep learning, neural networks, that kind of thing). In other words, I can probably get by with something that's "fine," I just don't want "lame."
Any quick, basic info in layman's terms on any of these processors would be appreciated. Since I may later build a desktop, I'm inclined to go as cheap as possible here, but since I will use this machine for years to come one way or another, and may need to do at least a few non-Chromebook tasks on it, I don't want something that completely lacks horsepower.
-
Dell.
Nothing else is acceptable.
That is all.
-
It's interesting how terms like a "Xerox" or a "Kleenex" or an "Aspirin" are actually brand names that came to mean a certain general kind of product, Coke as well in the South. Now "iPad" is likely a term like that even if it refers to a Sony.
Toward the end of my marketing degree 1500 years ago, I had to do a research case study on Kleenex. It was a very intentional thing on their part, for tissues to be commonly referred to as Kleenex. They spent a lot of money on marketing that and maneuvering public perception/language.
I think you're right about iPads. Most people are non-techy, and non-techy people I know tend to call their tablets iPads, no matter what brand they actually are. Although they're usually actually iPads.
-
I call my Samsung a tablet.
-
We used to call small portable computers "laptops" until people started burning their laps with them. Then for liability reasons, we started calling them "notebooks." But the general market never shifted terms, and now pretty much everybody's back to calling them laptops again, even the manufacturers.
Just a little stroll down memory lane.
-
Ok nerds, time to earn your keep around here.
I need a new laptop and processors have moved on 8 different times since the last time I did any research about them, so I'm looking for some quick help.
Dell website says I can get one with an Intel Core i7-1355U for $400
Or an AMD Ryzen 7 7730U for just $279.99
Or an Intel Core Ultra 9 288V for $750
Of course there are lots of other options that factor into the price, and there are lots more processors out there. I'm just throwing out a bit of a range with some that appear frequently.
I feel like I don't need anything too powerful because hopefully in the next year I'm going to build a Ferrari-level desktop, and for the most part I don't do a ton on my home laptop other than basic office work and web surfing. But I do like to mess with Python and R programming, and I need something that will neatly handle the basic ML algorithms (which, actually, shouldn't be too taxing, my very old laptop handles this just fine.....I don't really need to do any deep learning, neural networks, that kind of thing). In other words, I can probably get by with something that's "fine," I just don't want "lame."
Any quick, basic info in layman's terms on any of these processors would be appreciated. Since I may later build a desktop, I'm inclined to go as cheap as possible here, but since I will use this machine for years to come one way or another, and may need to do at least a few non-Chromebook tasks on it, I don't want something that completely lacks horsepower.
I think utee said this above, and it echoes what I overheard from our IT folks when I was getting my work laptop refreshed a month or so ago. Said work laptop chugged along with daily use for 5 years, so I'm inclined to believe the IT folks when they say...
You get what you pay for.
If you want something with halfway decent build quality, you're not getting it for $400.
I wouldn't worry THAT much about the processor. Pretty much anything you're doing will probably be nothing compared to the capability of the processor. If you have an option, I always recommend going towards the higher end on RAM capacity though. Most software hogs memory, even things like having a bunch of Chrome tabs open at once, so IMHO that's the place you're going to feel the pain of pinching pennies FAR more than the processor.
In real estate they often say "buy the worst house in the nicest neighborhood", with the idea being that's your best approach for appreciation. If you're looking for build quality, I might say "buy the lowest spec'd high-end laptop", so you're getting the better build quality but you're not breaking the bank on a whiz-bang processor that's overkill for what you need--but don't neglect RAM.
-
I think utee said this above, and it echoes what I overheard from our IT folks when I was getting my work laptop refreshed a month or so ago. Said work laptop chugged along with daily use for 5 years, so I'm inclined to believe the IT folks when they say...
You get what you pay for.
If you want something with halfway decent build quality, you're not getting it for $400.
I wouldn't worry THAT much about the processor. Pretty much anything you're doing will probably be nothing compared to the capability of the processor. If you have an option, I always recommend going towards the higher end on RAM capacity though. Most software hogs memory, even things like having a bunch of Chrome tabs open at once, so IMHO that's the place you're going to feel the pain of pinching pennies FAR more than the processor.
In real estate they often say "buy the worst house in the nicest neighborhood", with the idea being that's your best approach for appreciation. If you're looking for build quality, I might say "buy the lowest spec'd high-end laptop", so you're getting the better build quality but you're not breaking the bank on a whiz-bang processor that's overkill for what you need--but don't neglect RAM.
Agree 100% with all of this, perfectly stated.
And just to help differentiate, there is a bit of a premium for Intel processors, but it's not $120. So if you're looking at an AMD system for $280 and an Intel system for $400, you're not comparing apples to apples.
And also, either of those would represent entry-level and I'd stay away from that low price band if you're interested in reliability and longevity.
-
I'm doing 16 GB RAM. I think that'll be fine.
-
I'm doing 16 GB RAM. I think that'll be fine.
16G is good.
What screen size do you want?
-
I'm doing 16 GB RAM. I think that'll be fine.
Yeah, but I wouldn't go lower.
Heck, I just checked task manager on my laptop with 16GB and it says I'm using 87% of the memory... And about 7% CPU...
-
Yeah, but I wouldn't go lower.
Heck, I just checked task manager on my laptop with 16GB and it says I'm using 87% of the memory... And about 7% CPU...
Close all those extra browser tabs, man. :)
-
I have some sort of Dell. I had one before that that lasted me 8 plus years. I don't have a clue what innards it has.
Inspiron 3880. Intel(R) Core(TM) i5-10400 CPU @ 2.90 GHz, 8 GB RAM.
-
16G is good.
What screen size do you want?
It won't be too tiny, but a smaller size is fine since even at home I typically have it connected to a larger monitor for extended display.
If I'm understanding y'all correctly, don't worry about the processor and get something not-cheap so that it lasts longer. (I'm good on how much hard drive space and RAM I want, no offense, but I'm not asking for advice on that.) My remaining question is "What's not-cheap?" There's laptops for $750, laptops for $2275, and everything in between (I realize these are also probably starting prices depending on how it's spec'd out). I mean, for me, $750 isn't cheap, but compared to a lot of the laptops, it is. I just want something I can get a reasonable 5-7 years out of. Actually, the laptop I have now is from 2012 and other than a bit of screen damage it's still working like a champ. It just can't install Windows 11 and soon support for 10 will stop. Also I want to replace it before it unexpectedly craps out on me one day. But it's still trucking along with nary a problem.
It's been a long time since I bought one at a regular price. The one I have now was given to me second-hand, and before that I knew some guys who worked at Dell who always gave me their annual employee discount code for money-off when I wanted to buy one. It's been since probably 2005 that I bought a Dell for regular price, so I'm not very used to the price ranges compared to quality.
-
https://twitter.com/TeslaKing420/status/1932821372180299997
-
For Dell specifically, I can give you a decoder.
We now have two lines of business.
Dell Laptops-- these are the consumer laptops that used to be called Inspiron.
Dell Pro Laptops-- these are the commercial/corporate laptops that used to be called Latitude.
(Also, since we're currently in transition, you can still find the Inspirons and Latitudes, and they're the previous generation so there are probably some decent deals on them)
Anyway, within those two LOBs, there are three classifications. The base model, then one called Plus, then one called Premium. Those equate to what we used to call 3000 series, 5000 series, and 7000/9000 series. 3000 was entry level, 5000 was midmarket, and 7000/9000 were the premium products.
So that breaks down to:
Consumer: Dell Laptop, Dell Plus Laptop, Dell Premium Laptop.
Commercial: Dell Pro Laptop, Dell Pro Plus Laptop, Dell Pro Premium Laptop
I'd stick to the ones that are Plus or Premium. They will typically be more expensive but they are higher build quality and therefore usually better CSAT models.
As far as the differences between Consumer and Commercial, a basic way to think of it is that the Consumer ones tend to have more whizz-bang technology, and the Commercial ones tend to have fewer bells and whistles but offer a more stable software image, because that's what very large corporations value most-- stability over features. Of course, there's a floor for the amount of features a corporate laptop will have, and it's almost certainly plenty good enough for what you want to do with it.
Your sweet spot will be Dell Plus or Premium in the Consumer series, or Dell Pro Plus or Premium in the Commercial series.
And there are now AMD, Intel, and Qualcomm CPU options for all of the above. I'd stick with AMD or Intel, and the AMD versions will likely be somewhat less expensive than the Intel versions.
Our primary American competitor has analogs for all of the above, but I don't know the model names or branding conventions.
And finally, and I can't stress this enough-- avoid any brands tainted by the CCP.
-
My computer is pretty badass.
(https://i.imgur.com/Ghv8HD6.png)
-
My computer is pretty badass.
(https://i.imgur.com/Ghv8HD6.png)
Yes, the Precision workstations are indeed pretty bad-ass.
For reference, those are now called "Dell Pro Max."
-
I think this is 2 years old.
-
Our primary American competitor has analogs for all of the above, but I don't know the model names or branding conventions.
And finally, and I can't stress this enough-- avoid any brands tainted by the CCP.
As noted, I'm shopping on the Dell website, and that's the only stop. I'm relatively brand-loyal when I've had good experiences, and Dells are all I've ever had. With good experiences. There was an Inspiron I bought in 2009 that had a screen frame that tore up way too quick, imo, because the hinges it used were a crap design, and I had to buy a replacement case and take half the thing apart to install the new one. But all said, that laptop was good to me and I still use it for a Linux machine.
This is my work laptop. Not nearly as manly as Badger's. Other than the storage capacity, this machine would suit me just fine. It never balks at the number of different programs I keep open on it, and it runs my ML software fine. (Oddly, I swear this thing runs code in my IDEs slower than my personal laptop, which is from 2012. That's got to be in my head, but it sure seems that way.)
(https://i.imgur.com/h2CjVUT.png)
-
AutoCAD and all of its components require a lot of juice. This machine was $5,500.00, + docking station, 2 monitors, keyboard, etc.
-
No thanks.
I'm going to build a virtual rocket-ship of a desktop for less than that.
-
Yeah the Precision workstations are serious business.
My forecast modeling tools are pretty hardware intensive and I had a mobile workstation as my last work laptop, but this most recent refresh cycle, my job code was no longer eligible for it and you had to get senior VP approval to get one. So, now I just have a Lati 5 which does okay with the load, but it's a slog sometimes.
-
AutoCAD and all of its components require a lot of juice. This machine was $5,500.00, + docking station, 2 monitors, keyboard, etc.
at times I miss being the Regional AutoDesk salesman
the latest hardware, software and especially graphics - the porn was just better
-
Dell has a "Hot Deal" on a 14" Inspiron 5440.
Intel Core 5 120U (10 cores, up to 5.0 GHz)
16 GB memory
1 TB storage
$580
OR for just $20 more, an Intel Core 7 150U, 10 cores, up to 5.4 GHz
I think we might have a winner.
I don't know what 120U or 150U means. I have a rough idea about cores and understand the bottom-line function of GHz, if not the actual engineering mechanics.
Both say the HDMI output supports up to 1920 x 1080 @ 60Hz, which is fine for me. It did make me wonder.......my station at work uses a very large workstation monitor. Roughly the size of about two of the old 4:3 ratio monitors (large ones), i.e., quite a bit wider than a standard 16:9 monitor. Some of our workstations are like three of the old squareish monitors. I have no idea what ratios those are, but our laptops adapt to them, no problem. I'm not sure something that goes "up to 1920 x 1080" could output to a monitor like that. Not that it matters, since my home desk uses a standard widescreen monitor. It's just a matter of curiosity. I've never used giant monitors like the ones at work until this job and I don't know much about them.
-
You probably wouldn't ever notice the difference between the i5 and the i7 but for 20 bucks I'd probably do it.
Should attach to your work monitors just fine and adapt the signal on its own. 1920x1080 is full HD resolution. If your work monitor is capable of quad HD or 4K, that laptop's display output will not be able to take advantage of that capability.
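To put rough numbers on those tiers (standard resolutions, my arithmetic):
# Pixel counts for common resolutions -- why a 1920x1080-limited output can't
# drive a QHD or 4K panel at its native resolution.
for name, w, h in [("Full HD", 1920, 1080), ("QHD", 2560, 1440), ("4K UHD", 3840, 2160)]:
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} megapixels")
# Full HD ~2.1 MP, QHD ~3.7 MP, 4K ~8.3 MP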
-
Boom. Done. More job security for utee. You're welcome.