Those GUNS…

Meet Mike, my latest YouTube obsession… for obvious reasons. (And he can be spotted sporting a rainbow Apple Watch face in nearly all of his videos!)

Actually, his videos remind me of the very unpleasant history of early PCs that launched me on my career trajectory those many years ago. Looking back, it truly was stone knives and bear skins in comparison to today. MFM, RLL, selecting IRQs, terminating resistors; the crap we had to deal with! But at least we were treated like gods—or at least like first responders—for understanding how it all worked and getting the shit working again when it stopped.

Now it seems we're viewed as just janitors, cleaning up everyone else's mess because they're too intellectually lazy to even try and figure anything out on their own.

This Was an Accident

As you know, I've been participating in Apple's Public Beta programs for years. Early on (I think it was pre-Mavericks, actually) I learned the hard way that you do NOT install a beta on your main daily driver. Since that unfortunate mishap, I haven't stopped installing betas; I've just learned how to do it safely. Namely, by either installing on an external drive, or on a wholly separate partition on the main drive. The latter has been my preferred method for the last several iterations. And even then, I don't usually jump into the fray until Beta 3 or 4.
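For what it's worth, on a modern Mac that "separate partition" is really just an extra APFS volume that shares free space with the existing one, so nothing needs resizing. Here's a minimal sketch of creating one from a script; the container identifier (disk1) and the volume name are placeholders, so check diskutil list first:

```python
import subprocess

# Assumptions: the internal drive's APFS container shows up as disk1 (verify
# with "diskutil list") and "Beta" is just what I'm calling the new volume.
# APFS volumes share free space with their siblings, so nothing gets resized.
subprocess.run(
    ["diskutil", "apfs", "addVolume", "disk1", "APFS", "Beta"],
    check=True,
)
```

Point the installer at the new volume and the running system never gets touched (in theory, anyway).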

Well, imagine my surprise when I saw the new macOS Sonoma showing up as an available update in my System Settings today. I guess you don't even need to be a member of the cordoned-off Public Beta program any more to—if you're foolhardy enough—delve right into the Developer Beta Universe.

Okay, I thought. Why not? I'll create a separate partition and direct the installer there. Beta 1 is going to be fraught with danger, but it'll be safely tucked away from my critical data, so what's the harm?

Well, it turns out this beta (and perhaps all developer releases?) doesn't follow the normal install routine. I clicked "upgrade" and it did not offer the customary screen asking where I'd like to install it. It simply rebooted and started installing.

Oh shit, I thought. Thankfully, I had last night's full-disk backup so I knew if I had to go back to Ventura I could. It wouldn't be pretty, and it would take several hours to reinstall the OS over the air and then restore all my data, but at least I had that safety net.

After about 20 minutes, the Sonoma install completed and brought me to the login screen. "So far, so good," I thought. I logged in and everything came up normally.

Except…I had no internet connectivity. It showed I was connected via WiFi to my home network, but whenever I tried to go anywhere online I got a notice that "You are not connected to the internet." I tried connecting via my phone's hotspot. I tried via the Cox shared network point. Nothing.

I turned off my VPN. I turned off the firewall. I turned off my ad blockers. Still nothing. Of course, there was not much info on the web about this yet, so after screwing around with it for about a half hour, I said fuck it, rebooted, wiped the drive, and two hours later Ventura was back up and running.

Today's Apple Product Announcement

For me, the creepiness factor of Apple's "Vision Pro" is off the charts. It's not just the look, but all the personal biometric data it collects and processes—ostensibly in the name of maintaining your personal security.

While it's still a far cry from the device described in the final paragraphs below, I can easily see it ending up there at some point in the not-too-distant future.

A post I made in 2017 (skip to the "Brain Waves" section toward the end if you want to skip all the stereo geek stuff):

The Future of High Fidelity

I was cleaning stuff out over the weekend and ran across a file folder full of clippings I'd kept from various sources over the years.  I was a big hi-fi geek in high school and college, and one of the articles I kept that I'd always loved was a bit of fiction from the mind of Larry Klein, published July 1977 in the magazine Stereo Review, describing the history of audio reproduction as told from a future perspective.  Since the piece was written many years in advance of the personal computer revolution, the author was wildly off-base with some of his ideas, but others have manifested so close in concept—if not exact form—that I can't help but wonder if many young engineers of the day took them to heart in order to bring them to fruition.

And I would be very surprised indeed if one or more of the writers of Brainstorm had not read the section on neural implants, if only in passing…

Two Hundred Years of Recording

The fact that this year, 2077, is the Bicentennial of sound recording has gone virtually unnoticed. The reason is clear: electronic recording in all its manifestations so pervades our everyday lives that it is difficult to see it as a separate art or science, or even in any kind of historical perspective.  There is, nevertheless, an unbroken evolutionary chain linking today's "encee" experience and Edison's successful first attempt to emboss a nursery rhyme on a tinfoil-coated cylinder.

Elsewhere in this Transfax printout you will find an article from our archives dealing with the first one hundred years of recording. Although today's record/reproduce technology has literally nothing in common with those first primitive, mechanical attempts to preserve a sonic experience, it is instructive from a historical and philosophical perspective to examine the development of what was to become known as "high fidelity."

Primitive Audio

It is clear from the writings of the time that the period just after the year 1950 was the turning point for sound reproduction. For a variety of sociological, economic, and technological reasons, the pursuit of accurate sound reproduction suddenly evolved from the passionate pastime of a few engineers and Bell Laboratories scientists into a multimillion-dollar industry. In the space of only fifteen years, "hi-fi" became virtually a mass-market commodity and certainly a household term. In the late 1970s, the first primitive microprocessors (miniature computer-type logic-plus-memory devices) appeared in home audio equipment. These permitted the user to program what was known as an "FM tuner," "record player," or "tape recorder" to follow a certain procedure in delivering broadcast or recorded material.

For those who are not collectors of those antique audio devices, which employed "records" or "tapes," such terms require explanation. From its earliest beginning, recording employed an analog technique. This means that whatever sound was to be preserved and subsequently reproduced was converted to an equivalent mechanical irregularity on a surface. When playback was desired, this irregularity was detected or "read" by a mechanical sensing device and directly (later, indirectly) reconverted into sound. It may be difficult to believe, but if, say, a middle-A tone (which corresponds to air vibrating at a rate of 440 times per second) was recorded, the signal would actually consist of a series of undulations or bumps which would be made to travel under a very fine-pointed stylus at a rate of 440 undulations per second. Looking back from a present-day perspective, it seems a wonder that this sort of crude mechanical technique worked at all—and a veritable miracle that it worked as well as it did.

The End of Analog

Magnetic recording first came into prominence in the 1950s. Instead of undulations on the walls of a groove molded in a nominally flat vinyl disc, there were a series of magnetic patterns laid down on very long lengths of thin plastic tape coated on one side with a readily magnetized material. However, the system was still analog in principle, since if the 440-Hz tone was magnetically recorded, 440 cycles of magnetic flux passed by the reproducing head in playback. All analog systems—no matter what the format—suffered the same inherent problem (susceptibility to noise and distortion), and the drive for further improvement caused the development of the digital audio recorder.

Simply explained, the digital recording technique "samples" the signal, say, 50,000 times a second, and for each instant of sampling it assigns a digitally encoded number that indicates the relative amplitude of the signal at that moment. Even the most complex signal can be assigned one number that will totally describe it for an instant in time if the "instant" chosen is brief enough. The more complex the signal, the greater the number of samples needed to represent it properly in encoded form.
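An aside from the present day: what Klein is describing here is plain old pulse-code modulation. A tiny sketch of the idea, assuming his 50,000-samples-per-second rate and a 16-bit quantizer (the bit depth is my assumption, not his):

```python
import math

SAMPLE_RATE = 50_000   # samples per second, per the article
FREQ = 440             # the middle-A tone from the earlier example
MAX_AMP = 2 ** 15 - 1  # 16-bit signed quantizer (my assumption, not Klein's)

def encode_tone(seconds: float = 0.001) -> list[int]:
    """Return the tone as a list of digitally encoded sample values."""
    count = int(SAMPLE_RATE * seconds)
    return [round(MAX_AMP * math.sin(2 * math.pi * FREQ * i / SAMPLE_RATE))
            for i in range(count)]

print(encode_tone()[:10])  # the first ten "numbers" assigned to the signal
```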

In the late Seventies and early Eighties, digital audio tape recording proliferated on the professional level, and slightly later it also became standard for the home recordist. Many of the better home videotape recording systems were adaptable for audio recording; they either came with built-in video-to-audio switching or had accessory converters available.

The video disc, first announced in the late 1960s, progressed rapidly along its own independent path, since it benefited from many of the same technical developments as the other home video and digital products. By the mid-1980s a variety of video-disc players were available that, when fed the proper disc, could provide either large-screen video programs with stereo sound or multichannel audio with separate reverb-only channels. The flat semiconductor TV screen that was available in any size desired appeared in the early 1980s. It was the inevitable outgrowth of the light-emitting diode (LED) technology that provided the readouts for the electronic watches and calculators that were ubiquitous during the early 1970s. Later in the decade, giant-screen home video faced competition from holographic recording/playback techniques. Whether the viewer preferred a three-dimensional image that was necessarily limited in size and confined (somewhat) in spatial perspective or a life-size two-dimensional one ultimately came down to the specifics of the program material. In any case, the two non-compatible formats competed for the next twenty years or so.

LSI, RAM, and ROM

By the late 1980s, the pocket computer (not calculator) had become a reality. Here too, the evolutionary trend had been clearly visible for some time. The first integrated circuits were built in the late 1950s with only one active component per "chip." By the end of the Seventies, some LSI (large-scale integrated circuit) chips had over 30,000 components, and RAM (random-access memory) and ROM (read-only memory) microprocessor chips became almost as common as resistors in the hi-fi gear of the early 1980s. ADC's Accutrac turntable (ca. 1976) was the first product resulting from (in their phrase) "the marriage of a computer and an audio component." The progeny of this miscegenation was the forerunner of a host of automatic audio components that could remember stations, search out selections, adjust controls, prevent audio mishaps, monitor performance, and in general make equipment operation easier while offering greater fidelity than ever before. As a critic of the period wrote, "This new generation of computerized audio equipment will take care of everything for the audiophile except the listening." Shortly thereafter, the equipment did begin to "listen" also, and soon any audiophile without a totally voice-controlled system (keyed, of course only to his own vocal patterns) felt very much behind the times. One could also verbally program the next selection—or the next one hundred.

"Resident" Computers

The turn of the century saw LSI chips with million-bit memories and perhaps 250 logic circuits—and the eruption of two controversies, one major and one minor. The major controversy would have been familiar to those of our ancestors who were involved in the cable-vs-broadcast TV hassles during the 1970s and later. The big question in the year 2000 was the advantage of "time sharing" compared with "resident" computers for program storage.

Since the 1950s the need for fast output and large memory-storage capacity had driven designers toward ever more sophisticated devices, most of them derived from fundamental research in solid-state physics. The late 1970s, a period of rapid advances, saw the primitive beginnings of numerous different technologies, including the charge-transfer device (CTD), the surface acoustic-wave device (SAW), and the charge-coupled device (CCD), each of which had special attributes and ultimately was pressed into the service of sound-reproduction processing and memory. The development of the technique of molecular-beam epitaxy (which enabled chips to be fabricated by bombarding them with molecular beams) eventually led to superconductor (rather than semiconductor) LSIs and molecular-tag memory (MTM) devices. Super-fast and with a fantastically large storage capacity, the MTM chips functioned as the heart of the pocket-size ROM cartridge (or "cart" as it was known) that contained the equivalent of hundreds of primitive LP discs.

The read-only memory of the MT carts could provide only the music that had been "hard-programmed" into them. This was fine for the classical music buff, since it was possible to buy the complete works of, say, Bach, Beethoven, and Carter in a variety of performances all in one MT cart and still have molecules left over for the complete works of Stravinsky, Copland, Smythe, and Kuzo. However, anyone concerned with keeping his music library up to date with the latest Rama-rock releases or Martian crystal-tone productions obviously needed a programmable memory. But how would the new program get to the resident computer and in what format?

By this time, every home naturally had a direct cable to a master time-sharing computer whose memory banks were constantly being updated with the latest compositions and performances. That was just one of its minor facilities, of course, but music listeners who subscribed to the service needed only to request a desired selection and it would be fed to and stored in their RAM memory units. Those audiophiles who derived no ego gratification from owning an enormous library of MT carts could simply use the main computer feed directly and avoid the redundancy of storing program material at home. Everyone was wired anyway, directly, to the National Computer by ultra-wide-bandwidth cable. The cable normally handled multichannel audio-video transmissions in addition to personal communication, bill-paying, voting, etc., and, of course, the Transfax printout you are now reading.

Creative Options

The other controversy mentioned, a relatively minor one, involved a question of creative aesthetics. The equalizers used by the primitive analog audiophiles provided the ability to second-guess the recording engineers in respect to tonal balance in playback. This was child's play compared with the options provided by computer manipulation of the digitally encoded material. Rhythms and tempos of recorded material could easily be recomposed ("decomposed" in the view of some purists) to the listeners' tastes. Furthermore, one could ask the computer to compose original works or to pervert compositions already in its memory banks. For example, one could hear Mongo Santamaria's rendition of Mozart's Jupiter Symphony or even A Hard Day's Night as orchestrated by Bach or Rimsky-Korsakov. The computer could deliver such works in full fidelity—sonic fidelity, that is—without a millisecond's hesitation.

Since Edison's time, the major problems of high fidelity have occurred in the interface devices, those transducers that "read" the analog-encoded material from the recording at one end of the chain or converted it into sound at the other. Digital recording, computer manipulation of the program material, and the MT memory carts solved the pickup end of the problem elegantly; however, for decades the electronic-to-sonic reconversion remained terribly inexact, despite the fact that it was known for at least a century that the core of the problem lay in the need to overlay a specific acoustic recording environment on a nonspecific listening environment. Techniques such as time-delay reverb devices, quadraphonics, and binaural recording/playback, which put enough "information" into a listening environment to override, more or less, the natural acoustics, were frequently quite successful in creating an illusion of sonic reality. But it continued to be very difficult to establish the necessary psychoacoustic cues. The problem was soluble, but it was certainly not easy with conventional technology. And the necessary unconventional technology appeared only in the early years of this century.

Brain Waves

It has long been known that all the material fed to the brain from the various sense organs is first translated into a sort of pulse-code modulation. But it was only fifty years ago that the psychophysiologists  managed to break the so-called "neural code." The first applications of the neural-code (NC) converters were, logically enough, as prosthetic devices for the blind and deaf. (The artificial sense organs themselves could actually have been built a hundred years ago, but the conversion of their output signals to an encoded form that the brain would accept and translate into sight and sound was a major stumbling block.)

The NC (encee) converter was fed by micro-miniature sensors and then coupled to the brain through whatever neural pathways were available. Since rather delicate surgery was required to implant and connect the sensory transducer/converter properly, the invention of the Slansky Neuron Coupler was hailed as a breakthrough rivaling the original invention of the neural code converter. The Slansky Coupler, which enabled encoded information to be radiated to the brain without direct connection, took the form (for prosthetic use) of a thin disk subcutaneously implanted at the apex of the skull. Micro-miniature sensors were also implanted in the general location of the patient's eyes or ears.  Total surgery time was less than one hour, and upon completion the recipient could hear or see at least as well as a person with normal senses.

What has all this to do with high-fidelity reproduction? Ten years ago a medical student "borrowed" a Slansky device and with the aid of an engineer friend connected it to a hi-fi system and then taped it to his forehead. Initially, the story goes, the music was "translated"—"scrambled" would be more accurate—into color and form and the video into sound, but several hundred engineering hours later the digitally encoded program and the Slansky device were properly coupled and a reasonable analog of the program was directly experienced.

When the commercial entertainment possibilities inherent in the Slansky Coupler became evident, it was only a matter of time before special program material became available for it. And at almost every live entertainment or sports event, hi-fi hobbyists could be seen wearing their sensory helmets and recording the material. When played back later, the sight and sound fed directly to the brain provided a perfect you-are-there experience, except that other sensory stimuli were lacking. That was taken care of in short order. Although the complete sensory recording package was far too expensive for even the advanced neural recordist, "underground" cartridges began to appear that provided a complete surrogate sensory experience. You were there—doing, feeling, tasting, hearing, seeing whatever the recordist underwent. The experience was not only subjectively indistinguishable from the real thing, but it was, usually, better than life. After all, could the average person-in-the-street ever know what it is to play a perfect Cyrano before an admiring audience or spend an evening on the town (or home in bed) with his favorite video star?

The potential for poetry—and for pornography—was unlimited. And therein, as we have learned, is the social danger of the Slansky device. Since the vicarious thrills provided by the neural-code-converter/coupler are certainly more "interesting" than real life ever is, more and more citizens are daily joining the ranks of the "encees." They claim—if you can establish communication with them—that life under the helmet is far superior to that experienced by the hidebound "realies." Perhaps they are right, but the insidious pleasures of the encee helmet have produced a hard core of dropouts from life far exceeding in both number and unreachability those generated by the drug cultures of the last century. And while the civil-liberties and moral aspects of the matter are being hotly debated, the situation is worsening daily. It is doubtful that the early audiophiles ever dreamed that the achievement of ultimate high-fidelity sound reproduction would one day threaten the very fabric of the society that made it possible.

OMG…

This whole TikTok hearing in Congress reinforces my belief that we need an entry requirement for politics (just like any company out there) to ensure better and smarter politicians lead the country. Basic knowledge of the internet and technology in general should be a requirement. Some of the questions during the hearing were so elementary and mind-numbingly stupid.


I Had One of These

The second-generation Sony D-100 was my second portable CD player. The first was a first-generation Sony D-7. Being an early adopter, I suffered through several problems with the D-7, but it was worth it to have "CD Quality" sound with me wherever I went.

The D-100, however, was much more reliable and was my constant companion for half a dozen years or so. The only problem it had was a dodgy headphone jack that kept coming unsoldered from the main circuit board, necessitating numerous self-repairs over the lifespan of the unit.

I don't remember what ultimately happened to either player, but my portable CD player was eventually usurped when I got into MiniDisc in the late 90s and early 00s, a format vastly superior for portable music but ultimately made obsolete by the iPod.

Le Sigh

This Sony model was the last portable MiniDisc player I owned. I got big into MD in the late 90s/early 00s. I had [more than one] MD deck, an MD player in my car, and of course, various portables. I hung onto the format until I got my first iPod, and when I started using iTunes, I knew MD—as wonderful a format as it was—was dead. The hardware was awesome, but Sony's software was absolute shit. I eventually sold off all my gear and the hundreds of discs I'd amassed, never looking back.

Every now and then, however, an image like this crosses my path and I just sigh.

Perhaps not surprisingly, Ben was also into MD. I mean, what are the chances of that?

The Vagaries of Browsers

I have—or should I say I had—four browsers loaded on my laptop. Why, you ask? Because NONE OF THEM work consistently with all websites I visit.

Being a Mac user, for the longest time I was using Safari exclusively. I like how it automatically offers to pop in those two-factor authentication numbers that come in via text when you're logging into certain websites—because, of course, it's integrated with Apple Messages.

For a while I was using the new Microsoft Edge. It's built on Chromium, but it doesn't seem to be the annoying memory hog (or data harvester) that Chrome has proven itself to be. It also allows me to use certain Chrome extensions (like ad blockers) that Safari does not.

Many years ago I used to be a Firefox apostle, but I found it refused to play nice with—or in some cases even open—certain websites. I finally abandoned it for that reason. Too much of a hassle. (Come to think of it, I had the same issue with Safari years ago and that's what led me to Chrome.)

I was discussing these browser issues with a friend of mine a few weeks ago. He said he was firmly in the Firefox camp and was incredulous that I wasn't—if only because of the browser's built-in security. He said most of the issues I'd been complaining about from years past had long since been resolved and that I should give it another try.

I took him up on his suggestion and have been happy with the decision. While it doesn't meet all of my needs consistently, it does so often enough that I removed Edge and Chrome.

What led me to this post, however, is a quirk I noticed when posting video clips. In Safari, Chrome, and Edge, when I post videos, they just appear as black boxes with the play button in the center. There's no preview, or "poster image," displayed, forcing me to add one manually in WordPress.

Firefox, however, displays the videos with the preview already in place, not only when I'm composing the posts, but also when they're published—something none of the other browsers did.

Granted, I still have to manually add a poster image for maximum compatibility for my readers, but it's still cool to know that it isn't needed if you're using Firefox exclusively.

Disappointed

Like millions of other Apple aficionados, on Monday I watched the WWDC broadcast.

And like I do almost every time I watch one of these things lately, I came away disappointed.

While I am not in the market for a new Mac—and not a MacBook Air by any means—I was still very much looking forward to seeing the array of fun new colors that were supposedly slated for this major redesign of the iconic laptop.

With Apple throwing a rainbow-hued paint bucket at the iMac last year, almost everyone was expecting them to follow a similar design aesthetic and do the same thing—including white keyboards—with the new Air.

Sadly that did not happen.

What we got was the usual silver and space gray, with two new colors: starlight (kind of a champagne gold) and midnight (a dark, dark navy that seems to border on black). While a new solid black would be welcome (anyone remember the black MacBook from years ago?), we didn't get that; nor did we get the expected white keyboards across the line.

I only half-listened to the presentations on iOS. It's not my focus. I don't hold nearly the amount of passion and engagement with my iPhone as I do with my Mac. It's a tool, nothing more.

And we got macOS 13, also to be known as…

Ventura? Really? I know it's just a name and next year it will be something else, but with all the inspiringly named locations in California you'd think they'd have gone with something a little more interesting. What's next? macOS Oxnard?

I'll admit the default wallpaper is rather pretty.

I personally liked the name that had been floated prior to WWDC…

But I get it. Something lumbering and, well, extinct (even though the name refers to Mammoth Lakes, or maybe Mammoth Mountain, and not the long-dead mammal) isn't exactly the image Apple is trying to project.

Regardless of the name, some of the features and applications touted in Version 13 (a lot of which, to be honest, are playing catch-up with Windows as well as Apple's own iOS) are interesting, but nothing reached out and grabbed me, demanding, "You need to install this beta NOW."

That said, will I upgrade when the final version becomes publicly available? Or even a late-stage beta on a separate partition? Of course I will. And I'll upgrade my iPhone to iOS 16 when the final version is available as well (I don't mess around with betas on my phone)—even though I'll no doubt continue to utilize only a small fraction of what it's capable of doing.

And Then Just Like That…

Adobe Bridge is working again. The program starts up quickly, and thumbnails generate almost instantaneously. There's been no update from Adobe. I haven't changed anything within the program itself. This is beyond weird, but nonetheless welcome.

But now that I've gotten so used to using XnView, does it even matter? Yeah, each program does certain things the other does not, but ultimately my final cataloging process forces me back to Bridge. Will I switch back permanently? Probably not. XnView allows me to do several things Bridge does not, including color-coding all the non-JPG or GIF images that need to be converted, making them stand out easily.

Venting

WARNING: This is gonna be a long one. You might just wanna skip over and head to the next nekkid man.

It started out simply enough. I wanted to throw together a document at work that explains the new way one of our home-brew applications works. It's something I regularly do and add to our department OneNote binder.

I made an initial screenshot and was surprised that I wasn't offered the option of opening it in Paint 3D as was usually the case. I went to open the application directly and it wasn't there. I had just used it a few days ago, but now it was gone. Completely.

How the fuck does that happen?

Suspecting that someone in Main ITS had been fucking around with something, I checked with a colleague to see if he had it on his workstation. He reported that no, it was gone from his as well.

Okay, I thought. No big deal. I'll just go out to the Microsoft Store and download it again.

What. The. Fuck?

Now I knew that Main ITS had locked down the store tighter than a virgin asshole, forcing us to use our corporate ID to access the store and then only offering a small smattering of apps to download, but I'd never run into it being blocked completely.

Maybe it was something with my ID? I logged out of the machine and logged back in with my administrative ID. Same issue. The same ID I use to set up new machines and update the pre-installed Store apps that ITS put on the image.

I logged into a desktop machine I use for purely administrative tasks and saw that Paint 3D was missing there as well. I attempted to reach the store on that machine and it got me right in. This was getting really weird.

I immediately suspected this was related to the fact that I had Windows 11 on my laptop and my admin desktop was still on Windows 10.

I went ahead and submitted tickets about the missing application and the blocked store on my laptop—knowing full well the Help Desk would just turn around and throw them back in our queue. To their credit, they only threw the "store blocked" ticket back at us. I got a call from a guy about the Paint 3D ticket, and he told me he could make it available in SCCM/Software Center for a direct download.

Sure enough, about ten minutes later I checked and it was available to download.

And the download failed. Repeatedly. I even rebooted to make sure.

I believe it was failing because it was trying to open the Microsoft Store, and since I couldn't get there to begin with it wouldn't install.

My next thought was that something in my profile on the machine was causing all these issues; some obscure bit flipped the wrong way. Recreating my profile from scratch would solve the issue surely.

Don't call me Shirley.

I logged back into the machine with my admin profile, renamed my regular user profile folder with a .bak extension, and cleared out the reference to that profile in the registry (exporting it to a .reg file first…just in case I needed it later.)
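For the curious, the whole operation boils down to three steps, roughly sketched below in Python. It has to be run from an elevated session under a different account, and the username and SID here are made-up placeholders; the real SID lives under the ProfileList key in the registry:

```python
import os
import subprocess

# Placeholders -- substitute the real profile folder and its SID, which lives
# under the ProfileList key. Run from an elevated session under another account.
profile_dir = r"C:\Users\someuser"
profile_sid = "S-1-5-21-0000000000-0000000000-0000000000-1001"
key = (r"HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList"
       "\\" + profile_sid)

# 1. Export the profile's registry entry first, just in case it's needed later.
subprocess.run(["reg", "export", key, r"C:\Temp\profile-backup.reg", "/y"],
               check=True)

# 2. Delete the registry reference so Windows rebuilds the profile at next logon.
subprocess.run(["reg", "delete", key, "/f"], check=True)

# 3. Keep the old data by renaming the profile folder instead of deleting it.
os.rename(profile_dir, profile_dir + ".bak")
```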

Logged out, logged in with my regular user credentials and waited while the machine churned away creating the profile. It churned and it churned, and after about 5 minutes I left to use the bathroom. When I got back ten minutes later it had logged in. But the task bar at the bottom of the screen was completely blank. No start button, no application icons, no clock…nothing.

I rebooted and logged back in. Same thing happened.

I went through the entire process one more time. Same result. Searching the internet for "empty task bar windows 11" brought up several solutions, none of which worked in my particular case.

It was time to go nuclear.

Given the number of additional applications and the amount of customization I'd done to my laptop, the last thing I wanted to do was wipe the entire thing and start fresh. The hardware was also going on five years old, so it was beyond time for me to replace it with something newer; the only thing that had prevented me from doing so for the last several months was the amount of work needed to put everything back the way it was. But now my hand was forced.

It was already late in the afternoon, but I managed to get a much newer laptop imaged almost precisely at quitting time. I thought I'd join it to the domain, load the half dozen default applications that are not part of the base image, get my standard account logged in and VPN set up, and leave the rest (that I could do remotely) for the next day, which was a scheduled WFH day.

It wouldn't let me join the domain. My admin account was now locked out because of too many failed login attempts.

WHAT. THE.  FUCK.

I called the Help Desk to have the account unlocked (because my colleagues had all left for the day) and was informed that "We can't unlock admin accounts. I'll open a ticket for Security & Identity Management."

I told him not to bother. I was just going to go home and let it reset itself overnight. I realized any thought of working from home the next day had been destroyed as I could not remotely do any of the remaining tasks necessary to get a functioning laptop.

I texted my boss and let him know what had happened and that I'd be coming into the office tomorrow (today)…but just long enough to get my laptop functioning again and then would finish out the day at home. He was fine with that.

As I texted Ben as I left work, "I have never had a more emotionally frustrating day at work than I had today."

Fortunately, by noon today I got everything up and restored on the new laptop and was back home shortly after lunch.

I also did not update this machine to Windows 11—nor do I plan to.

2025 cannot come quickly enough.

I'm Such a Geek

Lately my friend Cindy has been having ongoing issues with the Mac Mini she purchased while Ben and I were in Denver. It worked fine until right before the world locked down, when it started slowing down to the point that it was nearly unusable. At the time, I backed up her data, wiped the drive, and reinstalled everything. That seemed to help, but a year ago it began taking upwards of five minutes to come online. She wasn't using it that much; her husband had a new iMac for the family business, and she decided to just get a new iPad at the time.

Last week she called me because she wanted to offload the remaining data from the thing so it would be accessible on her iPad. Problem was, the Mac wasn't responding at all. She was able to get logged in, but then it locked up. I had her power-cycle the machine, and it then came back up with a totally black screen.

It sounded like something had died; I was hoping it was just the drive and I could pull it, slave it to my Mac and retrieve her data. The machine itself wasn't that important to her; she had planned on giving it to her nephew once all her stuff had been removed.

When things first started slowing down, I suggested pulling the mechanical hard drive and slapping in a solid-state drive. To be honest, I was hoping she didn't want to do that, because after watching videos on what was involved in getting the drive out of the Mini, I was intimidated as fuck. I'm a desktop PC hardware guy; I can pull one of those apart and put it back together in my sleep. But in the Apple world, I was sweating bullets just replacing the hard drive on my very first MacBook Pro.

Yet here we were. I knew I had to face my fears in order to get that drive out to troubleshoot further as it was coming up with a black screen when hooked up here as well.

Facing My Fears

Disassembling the Mini really wasn't that difficult. My anxiety level was high, but it was mitigated somewhat by knowing that if I did screw something up, it wasn't that big a deal: if she couldn't give the machine to her nephew as originally intended, it would just go to the recycler.

Once I had the drive out, I connected it via a USB adapter to my Mac. It wasn't even recognized.

I wanted Cindy to be able to still give the machine to her nephew, so I'd ordered a 256GB SSD a couple of days prior to put in the machine. The worst part of the reassembly was reattaching the power supply cabling to the system board, but once everything was back together, I hooked it up to the television in our living room (the most accessible thing with an HDMI port, because hooking it to my monitor would involve lots of cabling stuff I didn't want to deal with). I powered it up and…flashing folder icon with a question mark.

Success! The power supply and system board were both good. It was the drive itself that was causing the problem.

From there I was able to do an internet recovery to reload the OS. It even loaded Monterey, which was kind of a surprise.

Now that I've done the drive swap, I feel much more confident that I can do it again if the need ever arises.

Now the Bad News

Since the old drive wasn't being recognized at all by my Mac, I tried slaving it to my work laptop running Windows. I could hear the platters spinning and the heads moving, so I knew there wasn't anything physically wrong with the drive, and sure enough, when I went into Disk Management on my Windows laptop, it showed up, along with its various partitions.

I fired up Disk Utility on my Mac and reattached the drive there. After several minutes, it showed up, so I attempted to run First Aid. After churning for several minutes, it told me I needed to boot into Recovery Mode and run the repair there. I rebooted into Recovery Mode and fired up Disk Utility again. At this point it wasn't seeing the drive at all, so I called it a night and went to bed.

This morning it had reappeared, so I ran First Aid. I was not happy.

And of course, she had no backups. And I know her husband has no backup plan in place for his iMac, so we're going to have a little talk today.


Vintage Audio Pr0n

Everyone's heard of Marantz, Pioneer, Kenwood, Sansui, Sony, Technics, and Yamaha from the heyday of vintage audio, but I've always loved the look of Akai and thought their stuff was overlooked and underrated in the audio community.

I wanted that cassette deck in the worst way, but of course, it was way out of my budget.