0 sats \ 0 replies \ @MattAhlborg0 OP 1 May \ parent \ on: PayPerQ (PPQ.AI) AI chatbot now has Dalle3, GPT Vision, Claude3, and more! bitcoin
These are great, great notes told from my target customer's perspective, the "AI newbie".
Regarding the Alex Epstein idea: yes, this is very possible and something we are ideating on how to implement well.
Thanks for the bug and UX reports about new chats, deletion, etc. I had no idea about those.
Prompts are indeed confusing! An example page would be great; I just need to get to it. I would first like to build in some detection for when people are actually trying to use it. You are the first who has even bothered to talk about it.
Yes, we do allow you to go negative, but only for one query. After that you will need to pay up again! We will probably refine it so that you can't go negative at all in the future. It was just the easiest approach to implement, because the cost of each query is so variable that we don't really know whether your next query is going to push you negative.
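A minimal sketch of how that "go negative once" rule could work, assuming a simple sats-denominated balance (the class and method names here are hypothetical, not PPQ.AI's actual code):

```python
# Hypothetical sketch of a "go negative once" billing rule.
# Names and structure are illustrative, not PPQ.AI's implementation.
class Balance:
    def __init__(self, sats: int):
        self.sats = sats

    def can_query(self) -> bool:
        # Queries are allowed while the balance is non-negative; a single
        # expensive query may push it below zero, after which the user
        # must top up before querying again.
        return self.sats >= 0

    def charge(self, cost_sats: int) -> None:
        # Query costs are variable and only known after the fact,
        # so a charge may take the balance negative.
        self.sats -= cost_sats


b = Balance(5)
assert b.can_query()      # 5 sats left: query allowed
b.charge(8)               # variable cost exceeded the remaining balance
assert not b.can_query()  # now at -3 sats: must pay up again
```

Checking the balance only *before* the query, not after, is what lets exactly one query dip below zero.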
Thank you again for these amazing and passionate notes! Please let us know if/when you have other feedback. If you want to connect further, follow me on Twitter @mattahlborg, as that is where I'm most active!
I understand your flow now. You came to the website and changed from the default model (GPT-4 Turbo) to Llama before you submitted your first query. Then the payment modal came up with the "8-10 queries" sentence, which seemed expensive to you.
We will definitely be revamping the initial payment modal because it is very confusing to a lot of people.
Thanks for working through this with me.
If you select Llama from the dropdown and run some queries, you will see the actual price you paid in the "Account Activity" section. It should usually be 1-3 sats. The 25 cents is just a one-time deposit to buy a bunch of credits; after that payment, you draw down on that 25 cents over time. You can set it to 5 cents too if you want, and that should still buy you quite a few Llama queries.
Ah, I think I figured out the confusion. You saw "25 cents pays for 8-10 queries" and thought that applied to Meta Llama 3. The 8-10 refers to GPT-4 Turbo; you will get well over 100 queries with Meta Llama 3.
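As a back-of-the-envelope check on those numbers (the per-query USD costs below are assumptions implied by the 8-10 and 100+ figures, not exact PPQ.AI prices):

```python
# Rough sanity check; per-query costs are assumed, not exact.
DEPOSIT_USD = 0.25
GPT4_TURBO_PER_QUERY_USD = 0.028  # assumed: matches "8-10 queries per 25 cents"
LLAMA3_PER_QUERY_USD = 0.002      # assumed: a couple of sats per query

print(round(DEPOSIT_USD / GPT4_TURBO_PER_QUERY_USD))  # 9, inside the quoted 8-10 range
print(round(DEPOSIT_USD / LLAMA3_PER_QUERY_USD))      # 125, i.e. "over 100"
```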
Yea, we need to do a better job of explaining these things to people.
Thanks! The model selection is awesome but definitely getting to be overwhelming for the normal user, and we do need to convey pricing better somehow.
I've thought a lot about setting a fixed price but it gets tricky as then some power users start running queries which demand a lot of compute, and that pushes the fixed price even higher. Some platforms compensate for this by taking shortcuts behind the scenes that users don't see, but ultimately it leads to kneecapped AI outputs.
The highest quartile of our users still don't spend half of what a subscription costs on average, but that doesn't change the fact that people might still have the pricing anxiety that you are speaking of. So it's something I definitely need to ponder on more.
Appreciate the feedback.
We use openrouter as our supplier for Llama3-70B; their price is $0.27 per million tokens, and we tack on a margin after that and round up to the nearest sat. Ultimately, though, Llama3 queries on our platform rarely exceed a few sats (2/10 of a penny), so they aren't exactly breaking the bank lol.
I guess if you are plugging in hundreds of thousands of tokens of context it starts to matter, but for 99% of normal users this is incredibly cheap for the value you are getting from AI.
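To make that arithmetic concrete, here is a rough sketch of the pricing rule described above. Only the $0.27/M supplier price comes from the comment; the margin multiplier and BTC exchange rate are illustrative assumptions:

```python
import math

# Sketch of the pricing rule above. The supplier price is from the
# comment; the margin and BTC price are illustrative assumptions.
OPENROUTER_USD_PER_M_TOKENS = 0.27  # Llama3-70B via openrouter
MARGIN_MULTIPLIER = 1.5             # assumed markup, not PPQ.AI's real margin
BTC_PRICE_USD = 60_000              # assumed exchange rate

def query_cost_sats(tokens: int) -> int:
    """Supplier price plus margin, rounded up to the nearest sat."""
    usd = tokens / 1_000_000 * OPENROUTER_USD_PER_M_TOKENS * MARGIN_MULTIPLIER
    sats = usd / BTC_PRICE_USD * 100_000_000
    return max(1, math.ceil(sats))  # round up; charge at least 1 sat

# A typical ~2,000-token query costs a couple of sats:
print(query_cost_sats(2_000))  # → 2
```

Under these assumed rates a 2,000-token query lands at 2 sats, consistent with the "1-3 sats usually" figure quoted earlier; costs only climb once you push hundreds of thousands of tokens of context.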
I think you may be misunderstanding the pricing? The default invoice of 25 cents pays for many queries, not just one. The margin is actually very low currently compared to the raw API cost.
Was this possibly what happened? I think we need to make the UX much clearer about what 25 cents gets you, because some other users have also voiced this. Please let me know.
Yea, I'm sorry about that. The mobile version is indeed having some hard-to-pin-down issues, and we are looking into it. I know you didn't ask, but let me know if you want your sats back.
Also, if you want to do us a favor and walk us through exactly what happened in our Telegram, it would help us a lot. One of the problems with being "accountless" is that it's hard to get feedback from customers when things go wrong. They just kinda silently leave.
Hey, I believe I saw you running a bunch of queries through PPQ. I was wondering if we could chat offline about the UX you had? I'm on TG @mattius459
Give PPQ.AI a try. Lmk if you have any questions, Kevin. I'd be happy to explain AI trends on a call as well.
I'm trying to turn it into an actual business, yea. There are a lot of LLM wrappers out there with more resources than me (poe, typingmind, you.com, others), but I want mine to at least keep up with those and be the best one that operates on an "accountless", pay-per-usage model via bitcoin payments.
You forgot PPQ.AI!
The main contrasting points between PPQ.AI and the others are:
- Likely the best UI/UX with all of the nice little features you'd normally see on ChatGPT like markdown, copy/pasting code, input editing, folder organization, light/dark mode, keyword search, context adjustment, prompt generation, and other little things you only notice when you use AI every day.
- "accountless" model with no login or password
- Access to premium models like GPT4 and Claude 3
Disclaimer: I built PPQ.AI!