Sony A95L QD-OLED (New for 2023)

Looks like this is the one for my family room. Reviewers are saying it gets closer to the reference mastering monitors than any TV.

Bummed by only 2 HDMI 2.1 inputs, but this is clearly the best image quality on the market for film.
 
I just wish Sony didn't charge such a high "Sony Tax" on their shit. I looked at their OLED when I got my S95B and I did like it better... but I just couldn't justify the $1000 extra over the Samsung.
 
I just wish Sony didn't charge such a high "Sony Tax" on their shit. I looked at their OLED when I got my S95B and I did like it better... but I just couldn't justify the $1000 extra over the Samsung.
I hear ya. You are paying for better processing, reflection handling, etc. For gaming it is overkill, but for a home theater that performance is essential to me.
 
I hear ya. You are paying for better processing, reflection handling, etc. For gaming it is overkill, but for a home theater that performance is essential to me.
I mean I'd like it for gaming too. A big complaint I have with the S95B is that it doesn't properly do HGIG, and a lesser one is that its PQ tracking isn't great. It matters for games, since generally you want the tone mapping done by the game engine, not by the display. I didn't expect to like my MiniLED monitor as much as I do, and part of the reason is that it does proper PQ tracking through the whole range (the other part is the brightness).
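
For anyone who hasn't dug into it, "PQ tracking" just means the display following the SMPTE ST 2084 EOTF, which maps the HDR signal level to an absolute brightness. Rough Python sketch of that curve using the published ST 2084 constants (the loop values below are just illustrative check points, not measurements from any display):

Code:
# SMPTE ST 2084 (PQ) constants
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """Normalized PQ signal (0.0-1.0) -> absolute luminance in nits."""
    p = signal ** (1 / m2)
    y = max(p - c1, 0.0) / (c2 - c3 * p)
    return 10000.0 * y ** (1 / m1)

# A display that "tracks PQ" hits these targets until it runs out of brightness,
# which is exactly where HGIG wants the game engine, not the TV, to do the roll-off.
for s in (0.25, 0.50, 0.75, 1.00):
    print(f"PQ signal {s:.2f} -> {pq_eotf(s):7.1f} nits")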

Who knows when I'll get my next TV, but I'll certainly look at Sony again... but man, the Sony tax is a lot.
 
I just wish Sony didn't charge such a high "Sony Tax" on their shit. I looked at their OLED when I got my S95B and I did like it better... but I just couldn't justify the $1000 extra over the Samsung.
I hear you. However, maybe I'm just an old fart, but I remember the old Pioneer plasmas in the late 90s costing $40,000. The fact that you can get this kind of picture for $5,000 and have it be ~30 inches bigger is just amazing.
 
I hear you. However, maybe I'm just an old fart, but I remember the old Pioneer plasmas in the late 90s costing $40,000. The fact that you can get this kind of picture for $5,000 and have it be ~30 inches bigger is just amazing.
Oh don't get me wrong, things are getting so much better. There was a time when high-end video was just completely out of my range. Not only did I not make the money I do now, but even if I did it was just in a category I couldn't afford. I am exceedingly happy with the displays we can get. Heck even low-end LCDs are amazing compared to the garbage that was cheap CRTs from when I was a kid.

I am just annoyed with the amount Sony charges. I like their stuff, and I'd pay more for it, but they charge a LOT more and I so far haven't been able to bring myself to pay it.

Now I dunno what it'll be like when I get my next TV. The S95B is working well now and hopefully will last quite a while. Hard to say when I'll be in the market and what that market will look like. But currently it is the same ~$1000 Sony tax between the S95C and the A95L. That's a lot when you are talking about a $2500 display (in the case of the 65" S95C). Kinda hard to justify a 40% price increase for better video processing. Now if they charged like a $300 premium? Hell ya I'd do that.
 
They're not the nicest when it comes to warranties either, in terms of honoring them. I've heard of quite a few people who basically had a TV/monitor/you name it go out on them and Sony more or less implied it was the customer's fault.
 
This TV is now generally considered to be “the best consumer display EVER made.” A lot more reviews are out now and the reviewers are BLOWN away!

It is as close to a $30,000 reference monitor as you can get, period. There does not seem to be any display on the market with better performance. The brightness increase over last year's model is significant. Gaming features could be a bit better, but in terms of raw image quality, this is as good as it gets in 2023. Stoked!

The reviewers are drooling over this thing.

This will be my first experience with 4K gaming, coming from a 1080P plasma. I’ll post pictures. I don’t think I’ll have it until November.

Of course now that I’m at 4K I’ll need a new PC I’m sure. Hopefully my 3090 will hold up, but my CPU I’m guessing will cry out in pain.

I’ll do a new build when Intel launches its next CPU next year and probably go to a 5090 when it launches. My gear is ancient, but I’ve been waiting for this OLED moment.
 
I hear you. However, maybe I'm just an old fart, but I remember the old Pioneer plasmas in the late 90s costing $40,000. The fact that you can get this kind of picture for $5,000 and have it be ~30 inches bigger is just amazing.
Exactly, yes, and a 720p Pioneer Kuro at around 55 inches was like $7,000.

I’d rather plunk down for the best of the best and roll it for a decade.
 
I should add, this set will likely see significant discounting in 6-8 months like everything else. I got $400 off already.
 
I purchased one of these back in early Sept; had to drive to the BB warehouse to pick it up.

The reports are real: this display is the best I have ever seen. I also own an S95B, an A80J, and an LG C1.

Let me know if you guys have any questions, to me the Sony tax is totally worth it.
 
I purchased one of these back in early Sept; had to drive to the BB warehouse to pick it up.

The reports are real: this display is the best I have ever seen. I also own an S95B, an A80J, and an LG C1.

Let me know if you guys have any questions, to me the Sony tax is totally worth it.
What size? Is VRR working?
 
This thing is coming out so late in the year it has to be Sony's 2024 model too, right?
I just can't see them announcing something new at CES.
 
From the Rtings discussion thread:
  1. Tone mapping (both Gradation Preferred and Brightness Preferred) can’t handle very bright highlights and fills them with a blue/grey colour instead. Example: the shine on the car at the beginning of Mad Max: Fury Road. Confirmed by several owners, me included.
  2. Dolby Vision Dark has multiple bugs discovered and reported by Classy Tech.
  3. There is a pink tint to low luminance colours. Visible in Ahsoka, Episode 1 between like 8:09 and 11:30.
  4. Many owners have reported issues with audio delay, random restarts, choppy video, etc.
  5. My set and many others (incl. Classy Tech’s, AVForums’ and Vincent’s) have faint dark streaks across the panels.
Do you even test things like these if they don’t fall in your fixed categories?

It was reported that Sony has asked for a delay in reviews until a new firmware is released in early November (which is probably why Vincent from HDTVtest still hasn’t posted his).

UPDATE: Vincent posted his review, confirming DV issues, low luminance issues, excessive audio delay, dark streaks across the panel, etc.

Vincent's review: https://www.youtube.com/watch?v=PgNuuQgE7Tk
 
Everything about displays and TVs makes me think that making them does not attract the best programmers. So, so many bugs. I get that some of this stuff is very complicated, but when they get even fairly simple stuff wrong...

Maybe it's the constant yearly release cycle of new models that makes it impossible to develop these in a sensible way.
 

Armenius, issues aside, how is it?
 
Ugh. Had no idea premium TVs were still using 100 Mbps Ethernet. What year is this? So stupid. Probably saving $5 per unit sold. Cheap bastards, all of them.
If the intent is for streaming then surely 100 Mbps is plenty? What am I missing here? Surely firmware downloads aren't THAT large?
 
If the intent is for streaming then surely 100 Mbps is plenty? What am I missing here? Surely firmware downloads aren't THAT large?
Has nothing to do with firmware downloads. Watch the video above for context.

Some streaming services require more than 100 Mbps to get max quality. The irony here is that Sony Bravia Core is one such service (or so they advertise).

Then there's DIY streaming such as Plex, etc. 4k Blu-Ray can peak over 100 Mbps easily unless you transcode.

Not everyone favors a WiFi setup over wired.
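
To put some rough numbers on it (the bitrates below are illustrative guesses, not measurements from any particular disc or service, and the ~6% protocol overhead is just an assumption):

Code:
# Illustrative check of which stream bitrates fit through a 100 Mbps Ethernet port.
LINK_MBPS = 100            # fast-Ethernet line rate
USABLE = LINK_MBPS * 0.94  # assume ~6% lost to TCP/IP and Ethernet framing overhead

streams = {
    "typical 4K streaming app": 25,
    "Bravia Core 'Pure Stream' (advertised)": 80,
    "4K Blu-ray remux over Plex (peak)": 110,
}

for name, mbps in streams.items():
    verdict = "fits" if mbps <= USABLE else "will stall/buffer"
    print(f"{name:40s} {mbps:4d} Mbps -> {verdict}")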
 
Has nothing to do with firmware downloads. Watch the video above for context.

Some streaming services require more than 100 Mbps to get max quality. The irony here is that Sony Bravia Core is one such service (or so they advertise).

Then there's DIY streaming such as Plex, etc. 4k Blu-Ray can peak over 100 Mbps easily unless you transcode.

Not everyone favors a WiFi setup over wired.
Okay I had no idea. Wow! Noted then. And yes that’s hilarious about the Sony streaming service being too much for the Ethernet port. Lol
 
Finally watched Vincent's review. Have to say it's a shame these TVs can't do BFI at 120 Hz. That being said, 60 Hz providing 1080 lines of motion resolution is great. That puts it on par with the best plasma screens. Speaking of plasma, I find it interesting that it still more or less dominates OLED for motion at 24 Hz. Whereas OLED still relies on motion interpolation to handle the low framerate, plasma didn't need it. 24 Hz content on my mid-range Samsung plasma looks fantastic and smooth and doesn't use motion interpolation. It just ups the refresh rate to 96 Hz and displays each frame multiple times. I'm guessing the slower phosphor decay of the plasma screen makes it look more natural than the quick on/off of the OLED pixels.
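
Just to spell out that 24p cadence point with some quick arithmetic (simple integer frame repetition, nothing measured from either set):

Code:
# How cleanly 24 fps film maps onto a few refresh rates, assuming the display
# just repeats each film frame a whole number of times (no interpolation).
FILM_FPS = 24

for refresh_hz in (60, 96, 120):
    repeats = refresh_hz / FILM_FPS
    if repeats.is_integer():
        note = f"each frame shown {int(repeats)}x -> even cadence"
    else:
        note = "uneven cadence (e.g. 3:2 pulldown) or interpolation needed"
    print(f"{refresh_hz:3d} Hz: {note}")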

In any case, I'm glad that when my TV finally bites the dust, we have an emissive tech that is better than it in almost every way, and in the ways it doesn't completely trounce plasma, it is at least its equal. I feel like for the first time in forever, we haven't actually taken any major steps back from last generation's premium display tech.
 
Yes, actually my primary screen, even for my PC, is a 2013 Panasonic VT60 plasma, and the primary reason I have not upgraded it is the impeccable, reference-level motion handling of older content. I could have upgraded it at any time. Games also look beyond mind-blowing, but alas, no VRR or HDR.

I basically skipped all LED based tech released after 2007 waiting for a display like this.

While I don’t expect this new set to quite top the VT60 on motion, word is it has an incredible ability to upscale older and low-bitrate content, and the motion handling is the best yet on OLED. So I am willing to make the tradeoff.

If it wasn’t for the superior motion handling, I probably would not be spending $5k on a TV.

Also, I have a certain TV channel’s banner good and baked into my plasma, and I’m sick of it after 10 years!
Mine has burn-in too. :( Not terrible but you notice it if you look for it. Having small children means I'm not dropping that kind of change on... any electronics actually :D. But yeah, they would accidentally destroy an OLED in no time flat.
 
With inferior processing, motion handling, and image quality. I wouldn’t turn my nose up at a G3, however. It just doesn’t quite have the processing chops for a serious theater buff.
That has to do with the processor and firmware, not the HDMI inputs. I'm not doubting Sony's ability to code fantastic image processing in their televisions, but that is no excuse for the low number of HDMI inputs, which they say is a "cost" issue. Personally, I need 4 inputs (PC, Xbox Series X, PlayStation 5, UHD Blu-Ray player).
 
That has to do with the processor and firmware, not the HDMI inputs. I'm not doubting Sony's ability to code fantastic image processing in their televisions, but that is no excuse for the low number of HDMI inputs, which they say is a "cost" issue. Personally, I need 4 inputs (PC, Xbox Series X, PlayStation 5, UHD Blu-Ray player).
While I'd argue that you should use a receiver to plug in and switch all those devices to give you the good, booming sound they deserve... I'll agree that the cost argument is fucking dumb coming from a TV this expensive. Like, I totally get the cheaper argument for budget displays. If you are trying to keep the cost down, I can see having fewer ports, and fewer of them being 2.1, as it DOES cost money for those high-speed interconnects. But for something that Sony wants this much money for? STFU and put in a chip that can handle all 4 at 48 Gbps.
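
For perspective on that 48 Gbps number, a rough sanity check of what a full-rate port actually has to carry, assuming the common CTA-861 4K120 total timing (numbers are approximate):

Code:
# Rough check that 4K120 10-bit RGB fits in an HDMI 2.1 FRL link.
h_total, v_total = 4400, 2250     # active 3840x2160 plus blanking
refresh_hz = 120
bits_per_pixel = 30               # 10 bits per channel, RGB, uncompressed

signal_gbps = h_total * v_total * refresh_hz * bits_per_pixel / 1e9
link_gbps = 48 * 16 / 18          # 48 Gbps raw, 16b/18b FRL coding -> usable payload

print(f"4K120 10-bit RGB needs ~{signal_gbps:.1f} Gbps")
print(f"HDMI 2.1 FRL carries ~{link_gbps:.1f} Gbps usable")
print("fits" if signal_gbps <= link_gbps else "needs DSC")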
 
They claim they’d need two Pentonic SOC chips for that. Not sure it is their fault. I think it has more to do with the Pentonic not supporting it and they need that processor to hit their image quality target. I’m NOT happy about the lack of ports, but it is what it is.
They could just put a switcher in it. TVs have done that before: they just add an HDMI switch internally and hook it to the one high-speed port. Still costs something, but not that much.
 
Some streaming services require more than 100 Mbps to get max quality. The irony here is that Sony Bravia Core is one such service (or so they advertise).
They say you need to use Wi-Fi for it... to point out the strangeness:
[To] access Pure Stream™ at 30Mbps, you must have a minimum internet speed of 43Mbps. To access highest quality Pure Stream™ available at 80Mbps you must have a minimum internet speed of 115Mbps. To enjoy 80 Mbps with Pure Stream™ functionality, you need to connect to the Internet via Wi-Fi (wireless LAN) that supports IEEE 802.11 n/ac/ax(*2). — BRAVIA CORE
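
Running the numbers from that blurb, Sony is asking for roughly 1.4x headroom over the stream bitrate, which makes the 100 Mbps port look even sillier. Quick arithmetic on their own quoted figures:

Code:
# Sony's quoted requirements: 30 Mbps stream needs 43 Mbps, 80 Mbps stream needs 115 Mbps.
pairs = [(30, 43), (80, 115)]
for stream, required in pairs:
    print(f"{stream} Mbps stream needs {required} Mbps (~{required / stream:.2f}x headroom)")

# With ~1.44x headroom, the best a 100 Mbps Ethernet port could sustain:
headroom = 115 / 80
print(f"100 Mbps port -> roughly {100 / headroom:.0f} Mbps worth of stream")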
 
Again, a simple Ethernet-to-USB adapter allows unlimited wired speed…
Which again shows the strangeness: it seems it's not because the motherboard's chips are too limited for that speed.

A "smart" device used instead of the TV's built-in apps will also be gigabit if it has gigabit, and it can be really cheap, so it's not a hard issue to work around, just one that seems strange.

One can assume that with good Wi-Fi available, wired TVs are not popular among customers, even at that level of TV...
 
What I found most surprising is that, from my viewing distance, the XR Cognitive Processor also makes some noticeable improvements to 4K streaming content, not just 1080p. It makes 4K look like better 4K. You'll see it in human hair, eyes, and if there's any minor blurring around the face from a further-out shot, it will crisp that up. It's not just a sharpening effect either. I recommend putting Reality Creation on manual at 50%. That adds the effect without blowing out or otherwise harming the image.

I sit 8 feet from the 77 inch.
 
Come on, Sony (Samsung Display), you know you want to make a 42" version. I personally would buy at least two.
That size only comes in their "90" series, and I haven't heard any word about an A90L; the A90K is still the top of that line.
For most people, though, I'd say the C3 is the better option over the A90K. An A90L would have to offer a lot of these improvements for it to be worth sticking with Sony's premium pricing without any other benefit.
 