I’ve noticed similar “mini” news stories trickle out after Apple’s announcements. Does this happen organically, or does PR drop tidbits like this to select sources?
It seems like a very specific thing for a reporter to ask and find out about.
Look at the Tweet (X? Blurp? What do we call them now?) - it's got the spectrum of the panel, comparing previous and newer panels.
If you know what you're looking for in those, you can identify a lot of different phosphor configurations just by the particular shape of the RGB peaks - the older ones have a distinctive multi-peaked red emission that I've seen in various LED bulbs as well over the years.
I doubt Apple mentioned it to anyone. Applying a spectrometer to any new light emitting device is just the sort of thing some people enjoy doing.
> the Tweet (X? Blurp? What do we call them now?)
They're officially just called "Posts" now. It's a hell of a downgrade from how distinctive the old terms were, no wonder people still call them Tweets.
Community Notes was also set to be called Birdwatch originally, continuing the bird pun theme.
Maybe you can take some solace from the proliferation of terms for what were once universally referred to as daughterboards.
> Look at the Tweet (X? Blurp? What do we call them now?)
X-crement
Judges would have also accepted "X-ude".
xeet, with the x pronounced like Nahuatl: "sh"
https://pages.ucsd.edu/~dkjordan/resources/PronouncingNahuat...
Now you can say things like, "I just xeeted" or "Did you see that xeet?"
It's also the first step in professional display calibration.
Xits
Many people remain fascinated by Apple and the small choices that (traditionally) give their products a sense of careful and attentive design and engineering.
So there's both a supply of people eager to pick their products apart and a market of people eager to hear about all the little details and secrets.
While Apple probably does seed some stories intentionally, as their PR teams are sharp, they don't need to be doing so for swarms of these reports to pop up after announcements and first shipments.
It could simply be that people are now getting their hands on them and testing for things Apple didn’t specifically mention in its announcements.
That's my interpretation too: people start finding stuff out when they get their hands on it, and each has their own interests.
I think Apple doesn't specify it, to give themselves flexibility to change. It's not certain everyone will get the same panel.
I think Apple doesn’t communicate it because it makes some of the laptops with "the same" spec better and some worse.
I remember that was the case with ssds some time ago - some of the macbooks had a better one, some had a slightly worse one.
The M3 line with 256 GB storage used a single SSD NAND chip, which made it measurably slower than the M2 series at the same capacity. Though irrelevant for most daily work, it was a regression, and it seems to be fixed in the M4 line. Even so, I presume a bit of bad news like that would push people looking for 'the best spec' to buy the storage upgrade.
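The effect is easy to picture with a toy model. All bandwidth numbers below are invented and real controllers don't stripe perfectly, but sequential throughput scales roughly with the number of NAND packages the controller can interleave across:

```python
# Illustrative only: SSD sequential throughput scales roughly with the
# number of NAND packages the controller can stripe writes across.
# The per-package figure is a made-up round number, not a spec.

PER_PACKAGE_MBPS = 1500  # hypothetical per-NAND-package bandwidth

def seq_throughput(packages: int) -> int:
    """Aggregate sequential bandwidth assuming ideal striping."""
    return packages * PER_PACKAGE_MBPS

single = seq_throughput(1)  # one 256 GB chip
dual   = seq_throughput(2)  # two 128 GB chips at the same total capacity
# Same capacity, roughly double the bandwidth with two packages.
```

This is why identical capacities can benchmark very differently depending on how many physical chips the capacity is spread across.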
I'm pretty sure Apple is just a marketing machine. They have pro-Apple posts and smear campaigns on all the Samsung forums. Mainstream media marketing, but also guerrilla marketing on forums, social media, even newspaper comment sections. I only see this kind of thing from Russian propaganda.
Interesting. As I understand it, shifting the red curve to shorter wavelengths, even by a seemingly small amount, would improve visibility. And something I've learned is that red vision varies by a fair amount from person to person.
> And something I've learned is that red vision varies by a fair amount from person to person.
Are there vision tests, similar to audio tests, that figure out one's individual responses to different wavelengths of light? Super neat.
It would be cool to simulate different people's vision, not just colour-blindness but the more subtle variations.
> Are there vision tests, similar to audio tests, that figure out one's individual responses to different wavelengths of light? Super neat.
Unlike consumer audio equipment where you can easily do a frequency sweep to test hearing, you'd need a specialist light source to do the same. Something like a tunable laser. You could probably use a prism to do a similar sweep from a white light source.
You don't need a frequency sweep. You take three broad spectrum lights and ask people to adjust their brightnesses to match a selection of reference lights. The tool that does this is called an anomaloscope and it was invented before things like lasers in order to study how color vision worked. That work became the basis of CIE (and other standards) that now define how your screen renders accurate colors.
This setup is straightforward to adjust to different types of color vision too. They use 3 lights because that's how many opsins normal humans use for color vision. If you're testing di- or tetrachromats you can use 2 or 4 lights respectively, or 12 if you're testing intelligent mantis shrimp.
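The matching procedure reduces to linear algebra, which a small sketch can make concrete. All cone-excitation numbers below are invented for illustration; a real anomaloscope uses calibrated spectral lights:

```python
# Hedged sketch: an anomaloscope-style match is solving a 3x3 linear
# system. Each primary excites the three cone types (L, M, S) by some
# amount; matching the reference means finding intensities w such that
# A @ w == reference in cone-excitation space.

def solve3(a, b):
    """Solve the 3x3 linear system a @ w = b via Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(a)
    def replace_col(m, j, v):
        return [[v[i] if k == j else m[i][k] for k in range(3)]
                for i in range(3)]
    return [det(replace_col(a, j, b)) / d for j in range(3)]

# Rows: cone types (L, M, S); columns: the red, green, blue primaries.
# These excitation values are invented for the sketch.
A = [[0.90, 0.50, 0.05],   # L-cone response to each primary
     [0.30, 0.80, 0.10],   # M-cone
     [0.02, 0.05, 0.90]]   # S-cone
reference = [0.6, 0.5, 0.3]  # cone excitation of the reference light

weights = solve3(A, reference)
```

An observer whose L-cone (red) sensitivity is shifted effectively has a different first row of A, so they settle on different weights for the same reference, which is exactly what the instrument measures.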
Does this mean better motion response times? The M-series MacBook Pro displays have notoriously smeary displays while displaying high-motion content, so this would be a welcome addition.
It shouldn't make a difference. The film is illuminated by a blue LED and constantly glows a uniform yellow, which is the same mechanism as the white LEDs in a traditional display (a blue emitter illuminating a yellow phosphor coating). The LCD filters this light into specific pixels, and it would be the component responsible. I worked for a now-defunct QD company.
The way I thought LCD/LED displays worked was by RGB-filtering a uniform white backlight. Is it only this design that does phosphorescence per subpixel? Sounds way more energy efficient.
Oh, phosphorescence per subpixel (instead of flooding everything with "white" created through phosphorescence from blue in one central place) sounds like an awesome power optimization for cases where you still want/need a subtractive LCD instead of some *LED with an additive per-(sub)pixel emitter.
For those following outdoor sports tech: I wonder if this might be the secret sauce that allowed Garmin to abandon transflective screens in the Edge 1050, which, unlike their post-transflective watches, is still technically an LCD. (The not-at-all-secret meat of the change is a depressingly massive battery and a big loss in runtime, but I suspect the battery alone isn't enough to explain why the loss isn't even bigger.)
Notebookcheck says no. Their M4 Pro only did 5-10% better than last year's model, which is still bad.
> latest Cd-free QD films are very efficient, feature as good or better color gamut and better motion performance
Possibly yes.
So far, the answer anecdotally is no, at least not in situations where lit pixels move quickly into black areas. In practice, my obnoxious green-text-on-black terminal was kind of gross to scroll, but I haven't experimented much with other setups yet. Playing games has so far been fine, and scrolling in other contexts is fine for practical purposes. Happy to update this if you want, after I ruin my new MacBook by experimenting more.
Hmm.
I just set up a 4K terminal (542x143 chars) using the 'homebrew' theme (green on semi-transparent black) and did

    prompt% ls -larS RemoteAstrophotography_com-M63-Stellina.zip | awk '{print $5}'
    4514072533
    prompt% cat RemoteAstrophotography_com-M51-Stellina.zip | base64

... and it is happily scrolling up the screen, lightning fast, way, way too fast to read, and responding instantly to CTRL-Q/S. Seems OK to me.
The response being referred to here is the quickness of the pixels' ability to change.
Certainly a clever way to test it. In terms of response, I meant that at least on a fully black background, you should see ghost lines of text between where the line of text was and where it's going, almost like a mouse cursor with a trail
Was phosphor afterglow ever an issue with LCDs? Just wondering
LCD BLUs have a uniformly glowing background which is filtered by the LCD to make pixels. If there is delay in pixels updating, it would be the LCD causing it.
aw. of course. Backlight. Filtering. My brain was hallucinating UV emitting LCD with phosphors in place of filters, that's OLED... thanks.
Is there more to the thread, or just this one tweet/X thing? Response times notoriously suck on MacBooks; it would be nice to see that remedied, though anecdotally it doesn't seem like that's happened yet.
Edit: Nevermind, same tweet seems to have been quoted across a bunch of different other news sites. Apparently Blur Busters claims an improvement, I'll try it out and see how it is in some other contexts.
If you're not logged in to Xitter, navigating to a Xeet allows you to view the Xeet, but not the Xomments. Fortunately, there are open-source, self-hostable, privacy-preserving front-ends for Xitter, such as Nitter.
If setting it up yourself is too much work, you can use other public instances. One such instance is called xcancel. Load the Xeet as normal, then simply append "cancel" to the domain name before the period in your URL bar and hit enter :)
I thought Nitter shut down because of the Xitter API changes.
There were some workarounds, most buried in here: https://github.com/zedeus/nitter/issues/983
No need for workarounds. Only the flagship nitter.net was blocked. Nitter is an open source project and maintains a list of working instances on its wiki: https://github.com/zedeus/nitter/wiki/Instances
I remember hearing, a while ago, that the mechanism Nitter relied on (guest accounts) was shut down, and that the three instances that remain work by selfbotting. It's plausible, although I haven't verified it.
What’s selfbotting? Based on the name it sounds like something that requires me to surrender my own authentication token to some automation service… but that’s definitely not the case for these alternative Nitter frontends.
I’m not sure how Twitter ultimately blocked them. It would be pretty embarrassing (for Twitter) if it were a simple IP block of the Nitter.net servers, but that doesn’t seem too out of whack with Musk’s history of litigating bot behavior…
[flagged]
Who manufactures their displays? I'm guessing they have more influence in the design or manufacturing than most players, but is this just a matter of them telling Samsung/LG/etc "ok, we're going to use your quantum dot displays now"?
They source from a combination of Samsung, LG, and BOE (Chinese display manufacturer). The way the arrangement typically works is that manufacturers will send Apple preproduction samples and Apple will decide which are worth using for upcoming SKUs. The manufacturer will build out production facilities to meet that demand and whatever specs Apple wants. Apple may also help with investment or R&D to develop products to meet feature roadmap targets and increase supplier competition. It's a very dangerous game for the manufacturers.
Ah yes, the classic disjointed graphs with unknown y-axis scales.
I love how Apple has lots of these silent innovations. They work hard to give us great products.
I don’t care how much they improve the CPUs; I’m not upgrading before OLED.
Is OLED unequivocally better than IPS?
I've heard that there are screen lifetime issues?
Also, from my limited experience with a single OLED screen, it seems that most stuff was created for a certain kind of screen without as much colour fidelity, and now that stuff seems far more...obnoxiously "saturated"?...on an OLED screen.
> I've heard that there are screen lifetime issues?
This has gotten much, much better, especially with "tandem OLED" where you just stack two of 'em on top of each other. It should be fine these days.
> Also, from my limited experience with a single OLED screen, it seems that most stuff was created for a certain kind of screen without as much colour fidelity, and now that stuff seems far more...obnoxiously "saturated"?...on an OLED screen.
That's up to the display manufacturer to calibrate the screen. The content should just be what it is and specify its colorspace properly. (Note, "properly" depends on the environment around you, so if you really care about this you have to participate too.)
Lots of devices that come with OLED displays come with a "vibrancy" mode turned on by default that oversaturates colors until you turn it off. It does look great at a glance tho!
Conversely, lots of content is produced on (and for) less-than-stellar displays and, gamma and color profiles be damned, overcompensates with more saturated colours at the data level, because it's going to show up toned down.
When a display is actually able to put out those colours, the content then looks gaudily oversaturated. I've had such problems with non-OLED, "somewhat† calibrated", good-quality screens as well.
† I mean I did not calibrate them myself; they were factory calibrated, with a good-enough test-curve slip in the package.
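The mismatch is easy to demonstrate numerically. This hedged sketch interprets one linear-light RGB triple through both the standard sRGB and Display P3 RGB-to-XYZ matrices (published values, rounded) and compares how far each lands from the D65 white point in chromaticity:

```python
# sRGB and Display P3 linear-RGB -> XYZ matrices (standard published
# values rounded to 4 decimals), both relative to the D65 white point.
SRGB_TO_XYZ = [[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]]
P3_TO_XYZ   = [[0.4866, 0.2657, 0.1982],
               [0.2290, 0.6917, 0.0793],
               [0.0000, 0.0451, 1.0439]]
D65 = (0.3127, 0.3290)  # white point chromaticity

def chromaticity(rgb, m):
    """Linear RGB -> CIE xy chromaticity through matrix m."""
    X, Y, Z = (sum(row[i] * rgb[i] for i in range(3)) for row in m)
    s = X + Y + Z
    return (X / s, Y / s)

def sat(xy):
    """Crude 'saturation': chromaticity distance from the white point."""
    return ((xy[0] - D65[0]) ** 2 + (xy[1] - D65[1]) ** 2) ** 0.5

rgb = (1.0, 0.1, 0.1)  # a saturated linear-light red, authored for sRGB
as_srgb = sat(chromaticity(rgb, SRGB_TO_XYZ))  # the intended appearance
as_p3   = sat(chromaticity(rgb, P3_TO_XYZ))    # raw values on a P3 panel
```

The untagged red lands measurably further from white under the P3 interpretation, which is the "gaudy" effect described above; proper colorspace tagging plus conversion avoids it.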
in addition to tandem OLED...
Pixel aperture ratio has increased drastically since the early displays. This drops current density for a given amount of light output, and there's a nonlinear relationship between current density and segregation so that helps a ton.
Deuterated materials emit more light per unit current, which lowers the required current density and improves lifetime.
Microlensing, if your customers will accept a narrower viewing angle, improves brightness and lifetime in the same way.
There was a time when OLED problems were so severe that Lenovo cancelled their use on laptops for many years (e.g. the X1 Yoga series). It was so bad that I got the next-generation laptop for free when it was released.
OLED has had lower peak brightness than IPS. It may not be perceptually so because of no-backlight absolute blacks and higher contrast, but the difference starts to matter in broad daylight where OLED may not be bright enough, irrespective of matte vs glossy.
For what use case? Watching a movie on anything that isn’t OLED is a painful experience for me now, but coding backends on an IPS is perfectly fine.
The pixel response and contrast absolutely are. Battery life is a little worse (especially at high brightness). OLED pixel response is around 100 microseconds, compared to ~5 ms for IPS, and each pixel dims individually, allowing for actually good HDR.
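Back-of-envelope arithmetic shows why the transition time dominates perceived smear while scrolling. The scroll speed below is an arbitrary illustrative figure, and this ignores sample-and-hold blur, which adds roughly a frame's worth of smear on both technologies:

```python
# Rough model: while content scrolls, a pixel mid-transition shows stale
# color over a distance of about response_time * scroll_speed.
# Speeds and response times here are illustrative, not measurements.

def smear_px(response_s: float, scroll_px_per_s: float) -> float:
    """Approximate smear width in pixels from transition time alone."""
    return response_s * scroll_px_per_s

SCROLL = 3000.0                        # px/s, a brisk flick-scroll
ips_smear  = smear_px(5e-3, SCROLL)    # ~5 ms IPS transition
oled_smear = smear_px(100e-6, SCROLL)  # ~100 us OLED transition
# IPS smears ~15 px at this speed; OLED ~0.3 px, below a single pixel.
```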
I think so: more vivid colors and better blacks and viewing angles, at the risk of burn-in.
I'm just worried about burn-in from coding.
If you buy a MacBook it's supposed to last a long time, but I'm kind of skeptical of getting one right when it's released instead of a tried-and-tested IPS MBP.
Make the screen a wear item then.
OLED has been better in every way except longevity since 2018.
6 years later I’m not buying a piece of garbage LCD for $6k
Does a bear relieve himself in the woods?
Doesn’t OLED pixel layout fail to line up with modern text rendering engines? At least that’s what I believe I’ve read, from reports of banding around text on Windows in particular, which makes long-form text work a problem.
Shouldn’t be an issue under macOS for the most part; macOS has used grayscale antialiasing for several years now, since subpixel AA isn’t of much benefit on HiDPI displays and complicates text rendering considerably.
If there are any problems, it’ll probably be with cross platform software that doesn't use native text rendering and assumes RGB subpixel arrangements instead of obeying the system.
That depends which OLED panel you're talking about; they're not all the same.
See reply to sibling
People read text on the OLED screen of the iPad Pro all the time.
Yeah, I think it's a ClearType issue[0] specifically then.
[0] https://github.com/microsoft/PowerToys/issues/25595
FreeType also doesn't do the right thing on RGBW, at least. Reading the code in the repo suggests it's possible to reconfigure it to work properly with the Harmony algorithm, but I haven't worked out how yet. If anyone knows a way to sponsor efforts to fix this, I would totally contribute.
A separate problem: I don't think there is a standard way for monitors to communicate their subpixel layout such that the font rendering engine has access to it. That seems like a pretty big oversight when introducing these layouts in the first place.
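A tiny sketch shows why a wrong layout assumption produces the fringes mentioned upthread: the renderer writes a spatial intensity ramp into a pixel's channels assuming RGB stripe order, and on a BGR panel the physical ramp comes out mirrored. The layout strings here are illustrative:

```python
def physical_intensities(pixel_rgb, layout):
    """Map a pixel's (R, G, B) value onto left-to-right subpixel order."""
    channel = {'R': 0, 'G': 1, 'B': 2}
    return [pixel_rgb[channel[c]] for c in layout]

# Subpixel AA encodes spatial coverage in the channels: a soft edge is
# an intensity ramp across the pixel, here fading left to right.
intended = (1.0, 0.7, 0.3)
on_rgb = physical_intensities(intended, "RGB")  # ramp preserved
on_bgr = physical_intensities(intended, "BGR")  # ramp mirrored
```

On the BGR panel the edge's ramp is reversed, so instead of looking smoother the glyph edge gains a colored fringe on the wrong side, which is exactly the banding people report around text.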
I can't imagine subpixel rendering is at all worth it on modern DPIs
[flagged]
I cannot fathom why someone would create an account to post something so inane, so I'm guessing this is a bad attempt at farming account reputation.
the stock price forces them to "innovate" lol
What if there was a place with all the zip of Nuka-Cola?
Wouldn’t that be the cheer-cheer-cheeriest place in all the world?
Where the rivers're made of quantum and the mountaintops are fizz
With fun and games and rides for all the moms and pops and kids!
P.S. I have no idea what quantum or any of their terms mean. This is the closest thing that comes to mind. Great marketing, I guess, or not.
It's actually a technical term, not a marketing term this time:
https://en.m.wikipedia.org/wiki/Quantum_dot
High failure rate on those displays, calling it now.
Interesting. We have a failed Samsung QD TV, and when we called technical support the guy correctly guessed the screen issue, as if he had seen it many times before. What makes them unreliable? The problem isn't even in the QD film itself but in the LED array: an LED fails and shorts, then the other LEDs start running at higher voltage and overheat, causing a cascading effect where the problem starts small and develops until the set is unusable.