• 1 Post
  • 76 Comments
Joined 4 years ago
Cake day: March 23rd, 2020




  • Almost certainly not, although fair disclaimer, I don’t actually know. Ads need to be tailored to the user at delivery time, so it’s likely the YouTube frontend that requests an ad chunk in place of the next video chunk from blob storage. yt-dlp likely just requests successive chunks straight from blob storage, bypassing this entirely.

    If YouTube served ads by saying “point to an ad chunk next” in their blob storage, 1. Everyone would see the same ad and 2. Premium users would still see ads.

    To patch this, YouTube really needs to stop serving video chunks directly from storage, but I forget the reason they haven’t done that already.

    (Technical note: I’m assuming blob storage chunks contain 1–2 seconds of video plus metadata pointing to the next chunk, like a linked list. I’m not sure this is how YouTube actually works, but many video platforms do something similar.)
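    To make the linked-list chunk model above concrete, here’s a minimal sketch. The store layout, chunk IDs, and field names are all invented for illustration; this is not YouTube’s real storage format, just the shape of the idea.

```python
# Hypothetical "linked list" of video chunks, as described above.
# BLOB_STORE stands in for blob storage; a real client would do HTTP GETs.
BLOB_STORE = {
    "chunk-0": {"video": b"\x00" * 4, "next": "chunk-1"},
    "chunk-1": {"video": b"\x01" * 4, "next": "chunk-2"},
    "chunk-2": {"video": b"\x02" * 4, "next": None},  # last chunk
}

def download_all(first_chunk_id):
    """Follow next-pointers from chunk to chunk, the way a downloader
    fetching straight from storage would -- note there is no frontend
    in the loop, so there is no point at which an ad could be injected."""
    data, chunk_id = b"", first_chunk_id
    while chunk_id is not None:
        chunk = BLOB_STORE[chunk_id]  # stands in for a direct storage fetch
        data += chunk["video"]
        chunk_id = chunk["next"]
    return data
```

    An ad-injecting frontend, by contrast, would sit between the client and the store and sometimes hand back an ad chunk instead of `chunk["next"]` — which is exactly the step a direct downloader skips.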

  • OsrsNeedsF2P@lemmy.mltoProgrammer Humor@programming.devTough break, kid...
    +15 / −22 · edited · 5 months ago

    Thinking AI is an upgrade from pencil to pen gives the impression that you spent zero effort incorporating it into your workflow, yet still think you saw the whole payoff. It feels like watching my dad use Eclipse for 20 years without ever learning anything more complicated than having multiple tabs open.

    For anyone who wants to augment their coding ability, I recommend reading how GPT (and other LLMs) work: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/

    With that in mind, work on your prompting skills and give it a shot. Here are some things I’ve had immense success using GPT for:

    • Refactoring code
    • Turning code “pure” so it can be unit-testable
    • Transpiling code between languages
    • Slapping together frontends and backends in frameworks I’m only somewhat familiar with in days instead of weeks
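    To illustrate the “turning code pure” bullet: the trick is moving hidden dependencies (the clock, globals, I/O) into parameters. The function names below are invented for illustration.

```python
import datetime

def greeting_impure():
    # Hard to unit-test: the result depends on the wall clock.
    hour = datetime.datetime.now().hour
    return "Good morning" if hour < 12 else "Good afternoon"

def greeting_pure(hour):
    # Same logic, but the clock is now a parameter,
    # so a test can pass in any hour it wants.
    return "Good morning" if hour < 12 else "Good afternoon"
```

    This is exactly the kind of mechanical rewrite an LLM does well, because you can verify the result at a glance and the tests prove the behavior didn’t change.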

    I know in advance someone will tunnel vision on that last point and say “this is why AI bad”, so I will kindly remind you that the alternative is doing the same thing by hand… in weeks instead of days. No, you don’t learn significantly more doing it by hand (in fact, when accounting for speed, I’d argue you learn less).

    In general, my biggest tips for using LLMs are: 1. They’re only as smart as you are, so get them to do simple, time-consuming tasks that you can easily verify; 2. They forget and hallucinate a lot, so don’t give them more than 100 lines of code per chat session if you need high reliability.

    Things I’ve had immense success using Copilot for (although I cancelled my Copilot subscription last year; I’m going to switch to this when it comes out: https://github.com/carlrobertoh/CodeGPT/pull/333):

    • Adding tonnes of unit tests
    • Making helper functions instantly
    • Basically anything autocomplete does, but on steroids
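    As a sketch of the “tonnes of unit tests” point: repetitive case-by-case tests like these are exactly what autocomplete-style tools fill in well, since each case is trivial to eyeball. The helper and cases are invented for illustration.

```python
def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

# The kind of boilerplate tests a tool like Copilot suggests after
# seeing the first one or two:
def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

def test_slugify_extra_spaces():
    assert slugify("  Tough   break,  kid  ") == "tough-break,-kid"

def test_slugify_empty():
    assert slugify("") == ""
```

    You still review each suggested case, but you skip the typing.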

    One thing I’m not getting into on this comment is licensing/morals, because it’s not relevant to the OP. If you have any questions/debate for this info though, I’ll read and reply in the morning.


  • OsrsNeedsF2P@lemmy.mltoProgrammer Humor@programming.devTough break, kid...
    +9 / −53 · edited · 5 months ago

    Using an IDE isn’t programming either

    But I’ll definitely prefer hiring someone who does. Sure, you can code in Vi without plugins, but why? Leave your elitism at home. We have deadlines and money to make.

    Edit: The discussions I’ve had about AI here on Lemmy and Hacker News have seriously made me consider asking whether the candidate uses AI tools as an interview question, with the only correct answer being some variation of “Yes, I do”.

    Boomer seniors scared of new tools is why Oracle is still around. I don’t want any of those on my team.