• 4 Posts
  • 187 Comments
Joined 1 year ago
Cake day: October 4th, 2023

  • Today’s users have massive amounts of computer power at their disposal, thanks to sales of billions of desktop and laptop PCs, tablets and smartphones. They’re all programmable. Users should be able to do just enough programming to make them work the way they want. Is that too much to ask?

    Smartphones – and to a lesser degree, tablets – just aren’t a phenomenal programming platform. Yeah, okay, they have the compute power, but most programming environments – and certainly the ones that I’d consider the best ones – are text-based, and in 2025, text entry on a touchscreen still just isn’t as good as with a physical keyboard. I’ll believe that there is room to considerably improve on existing text-entry mechanisms, though I’m skeptical that touchscreen-based text entry is ever going to be on par with keyboard-based text entry.

    You can add a Bluetooth keyboard, and one isn’t strictly essential, but the lack of a good keyboard is a real barrier. If I were going to author Android software, I do not believe that I’d do the authoring on an Android device.

    When Dartmouth College launched the Basic language 50 years ago, it enabled ordinary users to write code. Millions did. But we’ve gone backwards since then, and most users now seem unable or unwilling to create so much as a simple macro.

    I don’t know about this “going backwards” stuff.

    I can believe that a higher proportion of personal computer users in 1990 could program, at least to some degree, than can users of, say, Web-browser-capable devices today.

    But not everyone in 1990 had a personal computer, and I would venture to say that the group that did probably was not a representative sample of the population. I’d give decent odds that a lower proportion of the population as a whole could program in 1990 than today.

    I do think that you could make an argument that the accessibility of a programming environment declined somewhat for a while, but I don’t know about the decline being monotonic.

    It was pretty common for personal computers around 1980 to ship with some kind of BASIC programming environment. Boot up an Apple II, hit…I forget the key combination, but it’ll drop you straight into a ROM-based BASIC programming environment.

    After that generation, things got somewhat weaker for a time.

    DOS had batch files. I don’t recall whether QBasic was standard with the OS. checks It did for a period with MS-DOS, but was a subset of QuickBasic. I don’t believe that it was still included by the later Windows era.

    The Mac did not ship with a (free) programming environment.

    I think that that was probably about the low point.

    GNU/Linux was a wild improvement over this situation.

    And widespread Internet availability also helped, as it made it easier to distribute programming environments and tools.

    Today, I think that both MacOS and Windows ship with somewhat-more sophisticated programming tools. I’m out of date on MacOS, but last I looked, it had access to the Unix stuff via brew, and probably has a set of MacOS-specific stuff out there that’s downloadable. Windows ships with PowerShell, and the most-basic edition of Visual Studio can be downloaded gratis.


  • I’m okay with game prices going up – they’ve fallen far behind inflation over the decades – though personally I favor DLC rather than one big up-front purchase. Lower risk on both sides.

    And there are a lot of games out there that, when including DLC, run much more than $100. Think of The Sims series or a lot of Paradox games. Stellaris is a fun, sprawling game, but with all DLC, it’s over $300, and it’s far from the priciest.

    But if I’m paying more, I also want to get more utility out of what you’re selling. If a game costs $100, I expect to get twice what I get out of a competing $50 game.

    And to be totally honest, most of the games that I really enjoy have complex mechanics and get played over and over again. I think that most of the cost increase that game studios want is for asset creation. That can be okay, depending upon genre – graphics are nice, music is nice, realistic motion-capture movement is nice – but that’s not really what makes or breaks my favorite games. The novelty kind of goes away once you’ve experienced an asset a zillion times.




  • tal@lemmy.today to Technology@lemmy.world · Terminal colours are tricky

    Not to mention that the article author apparently likes dark-on-light coloration (“light mode”), whereas I like light-on-dark (“dark mode”).

    Traditionally, most computers were light-on-dark. I think it was the Mac that really shifted things to dark-on-light:

    My understanding from past reading was that that change was made because of the observation that at the time, people were generally working with computer representations of paper documents. For ink economy reasons, paper documents were normally dark-on-light. Ink costs something, so normally you’d rather put ink on 5% of the page than on 95% of it. If you had a computer showing a light-on-dark image of a document that would subsequently be printed dark-on-light on paper, that’d really break the WYSIWYG paradigm emerging at the time. So word processors and the like drove that decision to move to dark-on-light:

    Prior to that, a word processor might have looked something like this (WordPerfect for DOS):

    Technically, I suppose it wasn’t the Mac where that “dark-on-light-following-paper” convention originated, just where it was popularized. The Apple IIgs had some kind of optional graphical environment that looked like a proto-Mac environment, though I rarely saw it used:

    Update: apparently that wasn’t actually released until after the Mac. This says that that graphical desktop was released in 1985, while the original 128K Mac came out in 1984. So it’s really a dead-end offshoot, rather than a predecessor.

    The Mac derived from the Lisa at Apple (which never became very widespread):

    And that derived from the Xerox Alto:

    But for practical purposes, I think that it’s reasonably fair to say that the Mac was really what spread dark-on-light. Then Windows picked up the convention, and it was really firmly entrenched:

    Prior to that, MS-DOS was normally light-on-dark (with the basic command line environment being white-on-black, though with some apps following a convention of light on blue):

    Apple ProDOS, widely used on Apple computers prior to the Mac, was light-on-dark:

    The same was true of other early text-based PC environments, like the Commodore 64:

    Or the TRS-80:


    When I used VAX/VMS, it was normally off a VT terminal that would have been light-on-dark, normally green, amber, or white on black, depending upon the terminal:

    And as far as I can recall, terminals for Unix were light-on-dark.

    If you go all the way back before video terminals to teleprinters, those were putting their output directly on paper, so the ink issue comes up again, and they were dark-on-light:

    But I think that there’s a pretty good argument that, absent ink economy constraints, the historical preference has been to use light-on-dark on video displays.

    There’s also some argument that OLED displays – and, one assumes, any future display technology where you only light up what needs to be lit, rather than the LCD approach of lighting the whole panel and then blocking, and converting to heat, what you don’t want to be lit – draw somewhat less power for light-on-dark. That provides some battery benefit on portable devices, though in most cases, that’s probably not a huge issue compared to eye comfort.






  • In August 1993, the project was canceled. A year of my work evaporated, my contract ended, and I was unemployed.

    I was frustrated by all the wasted effort, so I decided to uncancel my small part of the project. I had been paid to do a job, and I wanted to finish it. My electronic badge still opened Apple’s doors, so I just kept showing up.

    I asked my friend Greg Robbins to help me. His contract in another division at Apple had just ended, so he told his manager that he would start reporting to me. She didn’t ask who I was and let him keep his office and badge. In turn, I told people that I was reporting to him. Since that left no managers in the loop, we had no meetings and could be extremely productive.

    They created a pretty handy app that was bundled with the base OS, and which I remember having fun using. So it’s probably just as well that Apple didn’t hassle them. But in all seriousness, that’s not the most amazing building security ever.

    reads further

    Hah!

    We wanted to release a Windows version as part of Windows 98, but sadly, Microsoft has effective building security.


  • And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.

    At least some Dell laptops authenticate to the charger so that only “authentic Dell chargers” can charge the battery, though they’ll run off third-party chargers without charging the battery.

    Unfortunately, it’s a common problem – and I’ve seen this myself – for the authentication pin on an “authentic Dell charger” to become slightly bent or something, at which point it will no longer authenticate and the laptop will refuse to charge the battery.

    I bet the charger on yours is a barrel charger with that pin down the middle.

    hits Amazon

    Yeah, looks like it.

    https://www.amazon.com/dp/B086VYSZVL?psc=1

    I don’t have a great picture for the 65W one, but the 45W charger here has an image looking down the charger barrel showing that internal pin.

    If you want to keep using that laptop and want to use the battery, I’d try swapping out the charger. If you don’t have an official Dell charger, make sure that the one you get is one of those (unless some “universal charger” has managed to break their authentication scheme in the intervening years; I haven’t been following things).

    EDIT: Even one of the top reviews on that Amazon page mentions it:

    I have a DELL, that has the straight barrel plug with the pin in it. THEY REALLY made a BAD DECISION when they made these DELL laptops with that type of plug instead of making it with a dog leg style plug. I have to replace my charger cord A LOT because the pin gets bent inside and it stops charging at that plug, but the rest of the charger is still good…


  • Up until the early 2000s, serial computation speed doubled about every 18 months. That meant that virtually all software just ran twice as quickly with every 18 months of CPU advances. And since taking advantage of that was trivial, new software releases did: they traded CPU cycles for shorter development time or more functionality, and demanded current hardware to run at a reasonable clip.

    In that environment, it was quite important to upgrade the CPU.

    But that hasn’t been happening for about twenty years now. Serial computation speed still increases, but not nearly as quickly any more.

    This is about ten years old now:

    https://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance/

    Throughout the 80’s and 90’s, CPUs were able to run virtually any kind of software twice as fast every 18-20 months. The rate of change was incredible. Your 486SX-16 was almost obsolete by the time you got it through the door. But eventually, at some point in the mid-2000’s, progress slowed down considerably for single-threaded software – which was most software.

    Perhaps the turning point came in May 2004, when Intel canceled its latest single-core development effort to focus on multicore designs. Later that year, Herb Sutter wrote his now-famous article, The Free Lunch Is Over. Not all software will run remarkably faster year-over-year anymore, he warned us. Concurrent software would continue its meteoric rise, but single-threaded software was about to get left in the dust.

    If you’re willing to trust this line, it seems that in the eight years since January 2004, mainstream performance has increased by a factor of about 4.6x, which works out to 21% per year. Compare that to the 28x increase between 1996 and 2004! Things have really slowed down.

    We can also look at about the twelve years since then, which is even slower:

    https://www.cpubenchmark.net/compare/2026vs6296/Intel-i7-4960X-vs-Intel-Ultra-9-285K

    This is using a benchmark to compare the single-threaded performance of the i7 4960X (Intel’s high-end processor back at the start of 2013) to that of the Intel Ultra 9 285K, the current one. In those ~12 years, the latest processor has managed to get single-threaded performance to about (5068/2070)=~2.448 times that of the 12-year-old processor. That’s (5068/2070)^(1/12)=1.07747, about a 7.7% performance improvement per year. The age of a processor doesn’t matter nearly as much in that environment.
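    If you want to sanity-check that arithmetic, here’s a quick sketch using the two single-thread scores quoted above (2070 and 5068); the exact figures will drift as the benchmark database gets updated:

    ```python
    # Annualized single-thread improvement from two benchmark scores.
    # 2070 and 5068 are the figures cited above for the i7-4960X (2013)
    # and the Ultra 9 285K; they shift as the benchmark database updates.
    old_score, new_score = 2070, 5068
    years = 12

    ratio = new_score / old_score        # ~2.45x overall
    annual = ratio ** (1 / years) - 1    # ~7.7% per year

    print(f"overall speedup: {ratio:.2f}x")
    print(f"annualized improvement: {annual:.1%}")

    # For contrast, the old "free lunch" rate of doubling every 18 months:
    print(f"doubling every 18 months: {2 ** (12 / 18) - 1:.1%} per year")
    ```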

    We still have had significant parallel computation increases. GPUs in particular have gotten considerably more powerful. But unlike with serial compute, parallel compute isn’t a “free” performance improvement – software needs to be rewritten to take advantage of it, many problems are hard to parallelize, and some cannot be parallelized at all.

    Honestly, I’d say that the most-noticeable shift is away from rotational drives to SSDs – there are tasks for which SSDs can greatly outperform rotational drives.


  • I’ve played it before on Linux.

    In general, you can just check ProtonDB, which will have an entry with status reports for most games.

    It has a Gold status.

    https://www.protondb.com/app/47890

    In 2025, my general take for Steam games isn’t even to check any more. I can list a very few games that I want to play that don’t work – Command: Modern Operations is the most prominent. But it’s pretty rare now.

    The main issue that comes up is some low-level anticheat stuff used in some competitive multiplayer games, like first-person-shooter deathmatch games, which doesn’t necessarily like Linux. Not a genre I play any more.

    EDIT: For anyone else who is interested in Command: Modern Operations, I’m looking forward to Sea Power: Naval Combat in the Missile Age coming out of Early Access, as it’s probably the closest thing to the above game and does run on Proton. Still has a lot of work left, though – the manual, which is normally quite substantial for milsims like this, is barely a few notes at this point.




  • I’ve kind of felt the same way; I’d rather have a somewhat-stronger focus on technology in this community.

    The current top few pages of posts are pretty much all just talking about drama at social media companies, which frankly isn’t really what I think of as technology.

    That being said, “technology” kind of runs the gamut in various news sources. I’ve often seen “technology news” basically amount to promoting new consumer gadgets, which isn’t exactly what I’d like to see here, either. Nor do I really want to see leaked photos of whatever the latest Android tablet from Lenovo or whoever is.

    I’d be more interested in reading about technological advances and changes.

    I suppose that if someone wants to start a more-focused community, I’d also be willing to join that, give it a shot.

    EDIT: I’d note that the current content here kind of mirrors what’s on Reddit at /r/Technology, which is also basically drama at social media companies. I suppose that there’s probably interest from some in that. It’s just not really what I’m primarily looking for.



  • I think that California should take keeping itself competitive as a tech center more-seriously. I think that a lot of what has made California competitive for tech is that it had tech from earlier, and that past a certain threshold, it becomes advantageous to start more companies in an area – you have a pool of employees and investors and such. But what matters is having a sufficiently-large pool, and if you let that advantage erode sufficiently, your edge also goes away.

    We were just talking about high California electricity prices, for example. A number of datacenters have shifted out of California because the cost of electricity is a significant input. Now, okay – you don’t have to be right on top of your datacenters to be doing tech work. You can run a Silicon Valley-based company that has its hardware in Washington state, but it’s one more factor that makes it less appealing to be located in California.

    The electricity price issue came up a lot back when people were talking about Bitcoin mining more, since there weren’t a whole lot of inputs and it’s otherwise pretty location-agnostic.

    https://www.cnbc.com/2021/09/30/this-map-shows-the-best-us-states-to-mine-for-bitcoin.html

    In California and Connecticut, electricity costs 18 to 19 cents per kilowatt hour, more than double that in Texas, Wyoming, Washington, and Kentucky, according to the Global Energy Institute.

    (Prices are higher now everywhere, as this was before the COVID-19-era inflation, but California is still expensive electricity-wise.)

    I think that there is a certain chunk of California that is kind of under the impression that the tech industry in California is a magic cash cow that is always going to be there, no matter what California does, and I think that that’s kind of a cavalier approach to take.

    EDIT: COVID-19’s remote-working also did a lot to seriously hurt California here, since a lot of people decided “if I don’t have to pay California cost-of-living and can still keep the same job, why should I pay those costs?” and just moved out of state. If you look at COVID-19-era population-change data in counties around the San Francisco Bay Area, it saw a pretty remarkable drop.

    https://www.apricitas.io/p/california-is-losing-tech-jobs

    California is Losing Tech Jobs

    The Golden State Used to Dominate Tech Employment—But Its Share of Total US Tech Jobs has Now Fallen to the Lowest Level in a Decade

    Nevertheless, many of the tech industry’s traditional hubs have indeed suffered significantly since the onset of the tech-cession—and nowhere more so than California. As the home of Silicon Valley, the state represented roughly 30% of total US tech sector output and got roughly 10% of its statewide GDP from the tech industry in 2021. However, the Golden State has been bleeding tech jobs over the last year and a half—since August 2022, California has lost 21k jobs in computer systems design & related, 15k in streaming & social networks, 11k in software publishing, and 7k in web search & related—while gaining less than 1k in computing infrastructure & data processing. Since the beginning of COVID, California has added a sum total of only 6k jobs in the tech industry—compared to roughly 570k across the rest of the United States.

    For California, the loss of tech jobs represents a major drag on the state’s economy, a driver of acute budgetary problems, and an upending of housing market dynamics—but most importantly, it represents a squandering of many of the opportunities the industry afforded the state throughout the 2010s.


  • tal@lemmy.today to Linux@lemmy.world · Finally did it

    I mean, you can run an infrared remote. I don’t know if there’s specifically an open source infrared remote out there, but I wouldn’t be surprised, as they aren’t very complicated devices.

    On the software side, you’re going to want LIRC. It’ll have a list of supported receivers.

    https://www.lirc.org/
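
    As a rough sketch of what the software side looks like once a supported receiver is set up – this assumes the usual lircd socket path and line format, and the remote and button names would come from whatever is in your lircd.conf – you can read decoded button presses straight off the lircd socket:

    ```python
    # Rough sketch: read decoded button presses from a running lircd.
    # Assumes the common default socket path; adjust for your distro.
    import socket

    LIRCD_SOCKET = "/var/run/lirc/lircd"

    def watch_buttons() -> None:
        with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
            s.connect(LIRCD_SOCKET)
            buf = b""
            while True:
                data = s.recv(256)
                if not data:
                    break  # lircd went away
                buf += data
                while b"\n" in buf:
                    line, buf = buf.split(b"\n", 1)
                    # lircd broadcasts lines like: "<hexcode> <repeat> <button> <remote>"
                    code, repeat, button, remote = line.decode().split()
                    print(f"{remote}: {button} (repeat {int(repeat, 16)})")

    if __name__ == "__main__":
        watch_buttons()
    ```

    From there you’d map button names to whatever actions you want, or just let an existing LIRC-aware application handle them.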

    kagis

    Here’s an open-source infrared remote:

    https://github.com/CoretechR/OMOTE

    Personally, I wouldn’t care if the remote is open source any more than I would my keyboard, but if it’s interesting to you…