Sorry, I haven't been hosting for a while, guys. I was busy growing this mustache. I'm uncomfortable.

Reviews are up for the GeForce RTX 5080, NVIDIA's latest AI product, formerly known as a GPU, and for many, including Linus "Tech Tips" Sebastian, it's falling a bit flat. In native rendering performance at 1440p, which is a lot of P, the RTX 5080 is roughly 10% better than the 4080 Super and 7% better than AMD's RX 7900 XTX. Those leads increase to 20% and 10% respectively at 4K, but the 5080 is still well behind the 4090, thanks in no small part to NVIDIA keeping the VRAM limited to 16 little gigabytes, while the 4090 has 24 gigabytes, and it does not like to share.

This release is disappointing to enthusiasts who remember that the RTX 4080 outperformed the 3090 and 3090 Ti at launch. Although NVIDIA gave that card a $1,200 price tag, the 5080 only costs $1,000, so it's missing a whole $200 worth of GPU juice, apparently. But no matter how mid reviewers say the RTX 50 series cards are so far, they're still gonna be hard to find. NVIDIA has warned of availability issues due to significant demand, probably because people just love AI so much and the bubble is definitely not popping. Can't do that. I didn't hear anything. It's solid. I'll be over here AI-ing, guys.

Speaking of which, there's a fresh new Chinese AI model to give US investors more panic attacks. E-commerce giant Alibaba has released Qwen2.5-Max, which the company says outperforms DeepSeek V3, although it can't be run locally. How are they doing this? Well, after DeepSeek's own model indicated to Redditors that it sometimes confuses itself with ChatGPT, OpenAI told the Financial Times it has evidence that DeepSeek trained models on data generated by ChatGPT (there's a sketch of what that looks like below), which was famously trained only on original text handwritten by Sam Altman. The manifesto, we call it. It's artisanal.

Microsoft says they're investigating these claims. In case you forgot, they're best friends with OpenAI, which we know because, thanks to social media, we can follow the friendships and rivalries of tech CEOs like they're Minecraft YouTubers.

Now, as we said on Monday, the US stock market panicked in response to the release of DeepSeek AI's models. But does that make sense? I mean, there are some reports that while DeepSeek's chatbots were trained on NVIDIA GPUs, when you use one on the web, it's now running on AI chips made by Huawei, which would give US investors some reason to be worried about NVIDIA's monopoly. However, whether these Chinese models are actually as cheap to train and use as DeepSeek claims is being debated by analysts.
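Not anything DeepSeek has published, to be clear, but here's a minimal sketch of what "training a model on data generated by ChatGPT" looks like in practice: distillation-style data collection, where you save a teacher model's answers as training examples for a student model. It assumes the official openai Python client and an API key in OPENAI_API_KEY; the teacher model name, prompts, and output filename are placeholders made up for illustration.

```python
# Sketch: collecting a teacher model's outputs as supervised fine-tuning data.
# Assumes the official `openai` Python client (>=1.0) and OPENAI_API_KEY set;
# the model name, prompts, and filename are illustrative placeholders.
import json

from openai import OpenAI

client = OpenAI()

prompts = [
    "Explain what VRAM is in one paragraph.",
    "What does a network tarpit do?",
]

with open("teacher_data.jsonl", "w") as f:
    for prompt in prompts:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder teacher model
            messages=[{"role": "user", "content": prompt}],
        )
        answer = resp.choices[0].message.content
        # Each line becomes one training example for the student model.
        f.write(json.dumps({"prompt": prompt, "response": answer}) + "\n")
```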
Anthropic CEO Dario Amodei, meanwhile, argues that the fact Chinese companies have to turn to less powerful hardware is proof that American restrictions on the export of AI chips are working. But let's say, sure, DeepSeek is way more efficient. As explained by Sam Altman, that doesn't mean AI companies are going to buy less hardware. $500 billion server? We can do 250. Keep it coming.

And it doesn't mean that you shouldn't check out our sponsor, the Drop BMR-1 V2 near-field monitors. They're compact speakers that can be customized with Drop BMR-1 grills in laser purple, shin-eye sage, and white. Minimalism. But despite their small size, they use balanced mode radiation, or BMR, drivers to deliver powerful full-range audio while running 25% cooler than normal. And that's a lot better than its predecessor. Plus, its Bluetooth capability has been updated to allow several devices to be stored, and its input jack has been redesigned with automatic muting to stop those pesky plug-insertion noises. Purchase the Drop BMR-1 V2 near-field monitors at the link in the description.

I thought that was all the stories, but it turns out I saved a quick bit in my mustache. Mustache, mustache. Enough.

Google has open-sourced Pebble OS, the operating system that powered Pebble smartwatches, which you may not be old enough to remember. They were bought by Fitbit, which was bought by Google in 2021. Well, the original founder of Pebble and Super Giga Chad, Eric Migicovsky, says the open-sourcing means he's bringing Pebble back. His new team is working on an open-source smartwatch with a focused core set of features that users can tinker with, so they don't have to depend on companies to fix stuff, like the blue triangle of death that briefly afflicted Garmin wearables this week. I mean, blue triangle? That's not even a thing. They made it up. What's next, pink tetrahedrons?

US President Donald Trump said in a speech on Monday that his government will place TARIFFS on the import of computer chips and semiconductors to return production of these essential goods to the United States. The Biden administration's CHIPS Act tried to do this by planning to invest $52 billion in domestic chip foundries, but Trump says they had it the wrong way around: chip companies don't need money, they need an incentive to not pay what Trump says could be a 25, 50, even 100% tax. Wow, it's like he's here. And that's why the US is collaborating with every country in the world to make sure they all place those tariffs, too.
That way, TSMC will have no choice. It's foolproof.

Comcast is rolling out new tech in select cities that it says could reduce latency by 78% in what the telecom giant calls an "ultra-low-lag connectivity experience." They put "experience" on the end. It makes it funny. The experience uses a tech standard that's been in the works for a while called L4S, which stands for low latency, low loss, scalable throughput... girl boss. L4S sounds like a Craigslist listing. You know, I don't know, you tell me in the comments what it means. I'll give you an S, eh? I'm an L, I'm looking for an S. (There's a sketch of how L4S traffic marks itself at the end of this transcript.) If Comcast's claims are true, this might make you happier with your existing internet, which is good, because the new chair of the FCC is killing his predecessor's proposal that would have made it easier for renters to switch internet service providers. Love the one you're with? Is that so hard? No one commits anymore.

And a coder has published a new open-source tarpit tool that tackles the problem of AI-training web crawlers hogging websites' resources by trapping those crawlers in an infinite, randomly generated maze of linked pages, sketched below. The tool is called Nepenthes, as in the biological name for a genus of carnivorous plants. The tool is just like those, except that while tropical insects are scary and huge, they're not known for hitting the same servers a million times in 24 hours. Although whoever wrote this has clearly never been to Botswana. No botflies, though. Just keep your carapaces out of my skin. I just really want to reference these botflies.
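For the curious, here's a minimal sketch of the tarpit idea, standard library only. This is not the actual Nepenthes code, and the port, link counts, and delay are made up for illustration: every URL returns a page of links to more procedurally generated URLs, seeded from the path so the fake site looks stable, and a deliberate delay makes each wasted request cost the crawler time.

```python
# Sketch of the tarpit idea behind tools like Nepenthes: every page links to
# more procedurally generated pages, so a crawler that blindly follows links
# wanders forever. Standard library only; not the real implementation.
import hashlib
import random
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Seed the RNG with the request path so each fake page is stable,
        # which makes the maze look like a real (if endless) website.
        rng = random.Random(hashlib.sha256(self.path.encode()).hexdigest())
        links = "".join(
            f'<li><a href="/{rng.getrandbits(64):x}">page {i}</a></li>'
            for i in range(10)
        )
        body = f"<html><body><ul>{links}</ul></body></html>".encode()
        # Drip-feed the response: the threading server means this stalls
        # only the crawler's connection, not other visitors.
        time.sleep(2)
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    ThreadingHTTPServer(("127.0.0.1", 8000), TarpitHandler).serve_forever()
```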
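And the promised L4S sketch from the Comcast story: the one concrete mechanism that fits in a code box is that L4S traffic identifies itself with the ECT(1) codepoint, the low two bits of the IP TOS byte set to 01, so network gear knows to give it the shallow low-latency queue. This is a hedged sketch of just that marking on a UDP socket, assuming Linux; for TCP the kernel owns the ECN bits, and a real L4S deployment also needs a scalable congestion controller, which this does not implement.

```python
# Sketch: marking UDP packets with the ECT(1) ECN codepoint that L4S traffic
# uses to request low-latency queuing. Assumes Linux; illustrative only, not
# a full L4S stack (no scalable congestion control, no feedback handling).
import socket

ECT_1 = 0b01  # ECN field value identifying L4S-capable traffic

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# The ECN field occupies the low two bits of the IP TOS / traffic-class byte.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, ECT_1)
sock.sendto(b"hello", ("127.0.0.1", 9999))  # placeholder destination
```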