Does anyone else think that the scenes where RTX is on look over- or underexposed depending on the lighting? I get that in real life details will be lost in a dark or bright room, but this seems a bit extreme.
I thought McLaren had the record for this season at 1.8 seconds.
Agreed. There is plenty of time between now and 2026 for corporate sentiment and priorities to change.
I do hope they’ll take this project seriously and that we get some real competition from them as a team.
While it’s unfortunate that it came at the expense of Danny getting injured, I’m excited to see how Lawson does in qualifying and the race.
I’d investigate the differences between the installs, particularly around graphics and power management. It sounds like your system is being woken up but hangs at some point during the resume process. You might get lucky and find the issue in the logs if you’re willing to dig through them.
When I’ve run NUCs in the past, I’ve had issues with external NVIDIA GPUs dropping off the bus when resuming from suspend. To “fix” the issue, I ended up limiting the sleep state to S2 or S3 so that the graphics card stayed on the bus.
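In case it’s useful, this is roughly the kind of check I did, sketched in Python. It assumes the standard Linux sysfs power files (`/sys/power/state` and `/sys/power/mem_sleep`), and that “deep” (S3) is the mode that keeps the card powered on your hardware; writing to the file needs root.

```python
#!/usr/bin/env python3
"""Rough sketch of how I inspected/pinned the suspend mode.

Assumes a Linux system exposing the standard sysfs power files;
writing requires root. Treating "deep" (S3) as the mode that keeps
the GPU on the bus is an assumption for this particular hardware.
"""
import sys
from pathlib import Path

MEM_SLEEP = Path("/sys/power/mem_sleep")   # e.g. "s2idle [deep]"
STATE = Path("/sys/power/state")           # e.g. "freeze mem disk"

def show() -> None:
    print("supported states:", STATE.read_text().strip())
    print("mem_sleep modes: ", MEM_SLEEP.read_text().strip())

def pin(mode: str) -> None:
    # Selecting "deep" makes suspend use S3 instead of s2idle,
    # which is what kept the external GPU on the bus for me.
    MEM_SLEEP.write_text(mode)

if __name__ == "__main__":
    show()
    if len(sys.argv) > 1:
        pin(sys.argv[1])
        show()
```

If I remember right, the same thing can be made persistent with the `mem_sleep_default` kernel parameter, but I’d confirm the sysfs toggle actually helps first.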
Do you know which display server, DE, and power management service you were running on each? If the logs don’t turn anything up, you can always compare the configs as well to see how they suspend and wake the system.
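If you want a starting point for the logs, here’s a rough sketch (assuming systemd, so `journalctl` is available) that pulls the suspend/resume-related lines out of the current boot’s journal; the keyword list is just my guess at what’s relevant.

```python
#!/usr/bin/env python3
"""Sketch: extract suspend/resume-related lines from the journal so the
two installs can be diffed side by side. Assumes systemd/journalctl;
the keyword list is only a guess at what matters for this issue."""
import subprocess

KEYWORDS = ("suspend", "resume", "pm:", "sleep", "nvidia", "drm")

def suspend_log(boot: str = "0") -> list[str]:
    # journalctl -b <N> dumps the system log for a given boot
    # (0 = current boot).
    out = subprocess.run(
        ["journalctl", "-b", boot, "--no-pager"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line for line in out.splitlines()
            if any(k in line.lower() for k in KEYWORDS)]

if __name__ == "__main__":
    for line in suspend_log():
        print(line)
```

Run it on both installs, redirect the output to a file on each, and diff them; if the resume is hanging, the last few messages before the stall are usually the interesting ones.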
I wonder if spoken vs. printed words getting treated differently comes down to the difference in accuracy between Google’s audio transcription and its OCR. Hi-res text images make OCR very good at telling “grape” from “rape”, but with audio it may not be as reliable.
Similarly, I wonder if it makes a difference that Google auto-generates subtitles for videos. When a word is spoken in a video, it isn’t something they produced, but when it appears in subtitles they generated, it is, and that could somehow open them up to legal issues. Regardless, it’s still unfortunate that YouTube is forcibly censoring subtitles.
Providing more specific information about what isn’t working would help resolve the issue.
The Flathub page lists some additional steps that need to be performed to allow virtual input. Did you do those?