[-] mackwinston@feddit.uk 28 points 1 year ago

An older friend of mine told me years back about an incident that happened on a university VAX running Unix. In those days everyone was using VT100 terminals, and the disk drives weren't all that quick. He was working on his own terminal when, without warning, he got this error trying to run a common command (e.g. ls):

$ ls -l
sh: ls: command not found

So he went on over to the system admin's office, where he found the sysadmin and his assistant, staring at their terminal in frozen horror. Their screen had something like:

# rm -rf / tmp/*.log
^C^C^C^C^C^C^C^C^C^C
# ls -l
sh: ls: command not found
# stat /bin/ls
sh: stat: command not found

A few seconds after hitting return, when the rm command didn't finish immediately, the sysadmin realised the errant space was there and madly hammered Ctrl-C to try to stop it. It turned out that the disk was slow enough that not everything was lost, and by careful use of the commands that hadn't been deleted, they managed to copy the executables off another server without having to reinstall the OS.
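For anyone who hasn't been bitten by this: the shell splits arguments on whitespace, so the stray space turns one path into two and / becomes its own target. A rough illustration (the refusal shown is modern GNU rm behaviour; the rm of that era had no such failsafe):

$ rm -rf /tmp/*.log     <- intended: recurse into /tmp, delete the logs
$ rm -rf / tmp/*.log    <- errant space: "/" is now its own argument
rm: it is dangerous to operate recursively on '/'
rm: use --no-preserve-root to override this failsafe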

[-] mackwinston@feddit.uk 13 points 2 years ago* (last edited 2 years ago)

I think 30 fps (25 fps in PAL-land) became the standard because televisions were 30 fps (NTSC) or 25 fps (PAL) due to interlacing. While the screen redraw on an NTSC television is 60 per second, it's done as two fields, so you only get 30 actual frames per second. Interlacing let you have a decent resolution (525 lines for NTSC, 625 lines for PAL) while keeping the TV signal within reasonable RF bandwidth limits: a single frame is sent as two fields, half of the picture in each field, on alternating scanlines.

So there's a lot of industry inertia to deal with, and 30 fps (or 25 fps where PAL was formerly the standard) ends up being the standard. And for video it's good enough (although 60/50 fps is still better; until fairly recently that would have meant too much bandwidth, so sticking with the old NTSC or PAL frame rates made sense).

But for computers, no one really used interlaced displays because they are awful for the kind of things computers usually show: the flicker is terrible with a static image in an interlaced screen mode. There were some interlace modes, but nearly everyone tried to avoid them; the resolution increase wasn't worth the god-awful flicker. So you always had 60 Hz progressive scan on the old computer CRTs (or, in the case of original VGA, IIRC it was 70 Hz). To avoid tearing, any animated content on a PC would use the vsync to stay synchronized with the CRT; that's easiest to do at the exact frequency of the CRT, and it provided very smooth animation, especially in fast-moving scenes. Even the old 8-bit systems would run at 60 fps (NTSC) or 50 fps (PAL), although 1980s 8-bit systems generally weren't doing full-screen animation, usually just animating parts of the screen.

So a game should always be able to hit at least 60 frames per second. If the computer or GPU is not powerful enough and the frame rate falls below 60 fps, the game can no longer use the vsync to stay synchronized with the monitor's refresh, and you get judder and tearing.
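As an aside, you can see what refresh rate your monitor is actually running at (and therefore what a vsync-locked game would target) with xrandr on Linux. This assumes an X11 session with xrandr installed, and the output below is just illustrative:

$ xrandr --query | grep -A1 " connected"
HDMI-1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 531mm x 299mm
   1920x1080     60.00*+  50.00    59.94

The asterisk marks the mode currently in use; the plus marks the display's preferred mode.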

Virtual reality often demands more (I think the original Oculus Rift requires 90 fps) and uses various tricks to ensure video is always presented at 90 fps: if the game can't keep up, frames get interpolated (see "asynchronous space warp"). But having to rely on asynchronous space warp because you can't hit the native frame rate is generally awful in VR; it inevitably distorts some of the graphics and adds some pretty ugly artifacts.

[-] mackwinston@feddit.uk 17 points 2 years ago

Honda. The answer is Honda.

[-] mackwinston@feddit.uk 15 points 2 years ago

Debian (a very conservative distro) switched to Wayland by default in Debian 10, if I'm not mistaken (we're now on 12).

I didn't notice the change until I tried to run a niche program that really needs X11. Unless you're doing that kind of thing, you can probably just use Wayland. At least in Debian it's really easy to switch between Wayland and X11 by selecting the session type when you log in.
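If you're not sure which one you're currently running, there's a quick check (XDG_SESSION_TYPE is set by most modern display managers, including the ones Debian ships; it'll print x11 on an Xorg session):

$ echo $XDG_SESSION_TYPE
wayland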

[-] mackwinston@feddit.uk 19 points 2 years ago* (last edited 2 years ago)

https://www.youtube.com/watch?v=iYWzMvlj2RQ

"I'm also very happy to point out that nVidia has been the worst [...] so nVidia, "fuck you!""

[-] mackwinston@feddit.uk 37 points 2 years ago* (last edited 2 years ago)

If people rely on driving for their work or independence, they should not be using their phones while driving. It's not hard. A friend of mine is a train driver, and as you can imagine, being caught using your phone in that job is instant dismissal. His solution is to turn the phone off and put it in his bag, so there can be no temptation to use it and, in the case of an incident, absolute proof that phone usage wasn't a factor. If a motorist can't resist the temptation to use their phone, they should do the same.

The overwhelming majority of people 'caught' by Mikey seem to be using social media, not taking urgent work calls.

It is still dangerous to use the phone in traffic jams, because what phone users do while texting or scrolling Instagram is look down, using at best their peripheral vision to see whether traffic is moving. So they see movement and move off, not having seen the pedestrian crossing through the gaps. I've witnessed a crash caused by exactly this kind of distracted driver (albeit in Houston): the phone user next to us heard a car horn from behind and, without looking, pulled forward and hit the car in front. Had there been someone crossing the road in front, they would have been crushed.

Being in a traffic jam is still actively driving. Mikey might not be a hero, but calling him a "tool of the oppression of the state" is severely overegging the pudding, when to avoid such "oppression" all you have to do is not use your phone and pay attention to driving.

[-] mackwinston@feddit.uk 20 points 2 years ago

All motorists are loud. Cities aren't loud, cars are loud. https://www.youtube.com/watch?v=CTV-wwszGw8

[-] mackwinston@feddit.uk 55 points 2 years ago

.rar is an awful proprietary format that needs to die, and die soon. You should NEVER use .rar files when sending files to others due to its closed proprietary nature.

.zip is preferable because everyone can handle it by default. 7z is OK because nearly everyone can handle it by default and it is an open format.
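If you're used to reaching for rar, the replacements are one-liners (assuming the zip and p7zip packages are installed; photos/ is just a placeholder directory):

$ zip -r photos.zip photos/     # portable everywhere, including stock Windows and macOS
$ 7z a photos.7z photos/        # better compression, still an open format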

[-] mackwinston@feddit.uk 14 points 2 years ago

We were out in a group and very drunk, and he said I could kiss him, but it ended up being this weird lunge and everyone fell about laughing.

He did stick his hands down my shorts later, so it wasn't entirely a failure...

[-] mackwinston@feddit.uk 20 points 2 years ago

tl;dr: costs for the chip shops are too high, and customers aren't willing to pay enough to keep them in business.

[-] mackwinston@feddit.uk 13 points 2 years ago

Context is important. From the context, the OP was talking just about disposable vapes.
