• 2 Posts
  • 72 Comments
Joined 2 years ago
Cake day: August 4th, 2023


  • Bookshop.org recently added ebooks, and I believe they have a UK store, for anyone trying to buy ebooks in a more ethical way. It lets you select a local bookstore of your choosing and support it when you purchase books. They take a small fee to cover their warehousing and shipping, I think, but pass along most of the profit (80%) to the local bookstore. They’re a certified B Corp, and their bylaws say they can’t sell to a major retailer (e.g., Amazon).


  • They don’t, but with quantization and distillation, as well as clever use of fast SSD storage (they published a paper on this exact topic last year), you can get a really decent model to work on device. People are already doing this with models like OpenHermes and Mistral (granted, those are 7B models, but I could easily see Apple doubling the RAM, optimizing models with the research paper I mentioned above, and getting 40B models running entirely locally). If the routing at the start of the network (the part that decides which queries the local model can handle) is good, a 40B model could take care of the vast majority of Siri queries without ever reaching out to the server.

    For what it’s worth, according to their WWDC keynote, this is basically what they’re trying to do.
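The routing idea above (a gate at the start of the network deciding which queries stay on device) can be sketched in a few lines. This is a hypothetical illustration; the keyword list, function name, and decision rule are my assumptions, not anything Apple has published — a real system would use a small classifier model rather than keywords:

```python
# Hypothetical sketch of an on-device router for a hybrid assistant.
# A real implementation would score queries with a small model; the
# keyword heuristic here is purely illustrative.

LOCAL_CAPABLE = {"timer", "alarm", "weather", "call", "text",
                 "play", "remind", "volume"}

def route_query(query: str) -> str:
    """Return 'local' if the on-device model can likely handle the
    query, or 'server' if it should be escalated."""
    words = query.lower().split()
    if any(w.strip(".,?!") in LOCAL_CAPABLE for w in words):
        return "local"
    return "server"
```

With a gate like this in front, the local model only sees queries it is likely to answer well, and everything else falls through to the server.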





  • Absolutely. But if we can flip Texas from right and alt-right to centrist, then we may actually get progressive candidates in other areas (and frankly, if we flip Texas blue, we’ll see Republicans shift their policies to the left). And perhaps, by some miracle, we can get STAR or ranked-choice voting, but that absolutely won’t happen while Republicans are in control here.

    Here’s to a better Texas (raises a Shiner [though prefers one of the many smaller microbrews here]).


  • That wasn’t my immediate assumption. It was a conclusion drawn after you repeatedly stated that Democrats were moving right and had done basically nothing good. Which is fine, and I probably shouldn’t have assumed how you would vote, though given the environment these days it wasn’t too audacious an assumption.

    By all means, critique. But please also vote for the furthest-left candidate who can win in every election you can vote in. Especially in Texas. This place needs so much damn help, and the Republican leadership definitely isn’t going to provide it (unless you’re ridiculously wealthy or own a large company). And get others to vote as well, because the only thing that will change Texas is changing the elected officials in charge.



  • AliasAKA@lemmy.world to Lefty Memes@lemmy.dbzer0.com, “Reminder...”, 9 months ago

    That’s not true at all. Biden has specifically protected more public spaces and land, while Trump specifically attempted to lease, sell, or otherwise make more of it available to corporate interests. Net neutrality is being restored after it was rolled back under Ajit Pai. We can be frustrated that Democrats don’t do enough, or aren’t further left, but to say they keep the status quo at the regressive place Republicans want to take us is demonstrably wrong. So while maybe they won’t expand the Affordable Care Act beyond where it currently is, they’ll at least keep it where it is and restore it if possible. If they won’t add new parks, they at least protect the ones we have and cancel corporate interests on existing ones. And if they won’t raise taxes heavily on the rich (which is where I think they’re most guilty of “status quo”), they at least won’t give them trillions in tax breaks like Trump did.


  • It doesn’t lead us to the same place just slower, at least not everywhere. One party has rolled back abortion protections and equal-rights protections, banned books, and pushed a host of other regressive policies. Democrats didn’t do that. Democrats might keep the status quo, but the Republican agenda is literally to move us backwards to a worse place (though if they want to move us back to when the top marginal tax rate was 90%, I could be on board with that part at least).









  • It may be no different from using Google as the search engine in Safari, assuming I get an opt-out. If it’s used for Siri interactions, though, it gets extremely tricky to verify that your interactions aren’t being used to inform ads or train an LLM. Much harder to opt out of than a default search engine, perhaps.

    LLMs do not need terabytes of RAM. Heck, you can run quantized 7-billion-parameter models on 16 GB or less (Bloom, Falcon 7B — and Falcon outperforms some models with larger memory footprints, so there’s room for optimization here). While not quite as good as OpenAI’s offerings, they’re still quite good. There are Android phones with 24 GB of RAM, so it’s quite possible for Apple to release an iPhone Pro with that much and run a model much as you’d run any large language model on an M1 or M2 Mac. Hell, you could probably fit an inference-only model in less. Performance wouldn’t be blazing, but depending on the task it could absolutely be sufficient. With Apple MLX and Ferret coming online, it’s entirely possible that you could, basically today, have a reasonable LLM running on an iPhone 15 Pro. People run OpenHermes 7B, for example, which uses ~4.4 GB, without those frameworks. Battery life does take a major hit, but to be honest I’m at a loss for what I need an LLM on my phone for anyway.

    Regardless, I want a local LLM or none at all.
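The memory figures above are easy to sanity-check with back-of-the-envelope math. A minimal sketch, where the 1.2 overhead factor for activations and KV cache is my own assumption, not a measured figure:

```python
def model_memory_gib(n_params_billion: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Rough RAM estimate for holding a model's weights, with a fudge
    factor for activations and KV cache (overhead is an assumption)."""
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

# A 7B model at 4-bit quantization lands around 3.9 GiB, in the same
# ballpark as the ~4.4 GB quoted for OpenHermes 7B above; a 40B model
# at 4 bits needs roughly 22 GiB, which is why a phone with 24 GB of
# RAM starts to make such models plausible on device.
```

The same function shows why unquantized inference is out of reach on phones: at 16 bits per weight, even a 7B model needs around 16 GiB.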


  • This is a really bad look. It will probably end up being an opt-in feature, and maybe Apple negotiates for a model that Google provides, that Apple hosts on premises, and that doesn’t send any data back, but it’s getting very hard for Apple to claim privacy and protection here (not that they do a particularly good job of that unless you block all their telemetry).

    If an LLM is gonna be on a phone, it needs to be local. Local is really hard because the models are huge (even with quantization and other tricks), so this seems incredibly unlikely. Then it’s just “who do you trust to sell your data for ads more, Apple or Google?” To which I say neither, and pray Linux phones take off (yes, yes, I know you can root an Android phone and de-Google it, but still).