I just got usage capped on GPT-4 after 20 messages – when I clicked “Learn More” on the message, I saw:

Thanks for your interest in GPT-4!

To give every Plus user a chance to try the model, we’re currently dynamically adjusting usage caps for GPT-4 as we learn more about demand and system performance.

We’re also actively exploring ways for ChatGPT Plus subscribers to use GPT-4 in a less constrained manner; this may be in the form of a new subscription level for higher-level GPT-4 usage, or something else.

Please fill out this form if you’d like to stay posted.

Now admittedly I paste massive chunks of code into GPT-4 as part of my daily workflow, and it's understandable if they want to make the amount of usage users get match the price they're paying… but I was still a little taken aback by the customer-facing bullshit of that whole "To give every Plus user a chance to try the model" and "as we learn more". Like bro, if you feel like setting a limit based on use, then just tell me what the limit is and how I can get more if I need it.

Anyone else run into this? Anyone have a good alternative (besides just sending it all to the platform API and paying out the ass)? GPT-4 is actually capable with code in my experience in a way that 3.5 and Copilot are not.

  • mozz@mbin.grits.dev (OP) · 7 months ago

    This morning was 177 KB in and out; call it 2/3 input and 1/3 output, which would mean roughly:

    118k bytes input ≈ 29k tokens = 29 cents
    59k bytes output ≈ 15k tokens = 45 cents
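
    The estimate above can be sketched as a quick Python check — assuming roughly 4 bytes per token for English text/code and GPT-4 Turbo API pricing at the time ($0.01 per 1k input tokens, $0.03 per 1k output tokens; both figures are assumptions, check the current pricing page):

    ```python
    # Back-of-envelope API cost estimate from raw byte counts.
    # Assumptions: ~4 bytes per token, and GPT-4 Turbo pricing of
    # $0.01 / 1k input tokens and $0.03 / 1k output tokens.
    BYTES_PER_TOKEN = 4           # rough average for English prose/code
    INPUT_PRICE_PER_1K = 0.01     # USD, assumed
    OUTPUT_PRICE_PER_1K = 0.03    # USD, assumed

    def estimate_cost(input_bytes: int, output_bytes: int) -> float:
        """Estimate API cost in USD from input/output byte counts."""
        in_tokens = input_bytes / BYTES_PER_TOKEN
        out_tokens = output_bytes / BYTES_PER_TOKEN
        return (in_tokens / 1000) * INPUT_PRICE_PER_1K \
             + (out_tokens / 1000) * OUTPUT_PRICE_PER_1K

    # 177 KB total for the morning, split 2/3 input and 1/3 output:
    total_bytes = 177_000
    cost = estimate_cost(total_bytes * 2 // 3, total_bytes // 3)
    print(f"${cost:.2f}")  # roughly $0.74 for the morning
    ```

    So the whole morning's traffic would land somewhere around 75 cents via the API, which is the comparison being made here.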

    I think you may be correct in your assessment