White Paper

The AI Buyer's Checklist: 23 Questions Before You Spend a Dollar

Jake McCluskey

I've watched hundreds of small businesses buy AI the wrong way. They get excited by a demo, hand over a credit card, and six months later they're paying a monthly fee for a tool nobody uses. This checklist is the conversation I wish every owner had with their vendor before signing anything. I'm Jake McCluskey, and after 25 years in digital marketing and 500+ client engagements, I can tell you the cost of a bad AI purchase is rarely just the invoice. It's the lost quarter, the team frustration, and the trust that gets harder to rebuild the next time someone pitches you a tool.

If you bring these 23 questions to your next vendor meeting, you'll change the power dynamic in the room. Good vendors will welcome the questions. Weak ones will squirm. That alone tells you most of what you need to know.

What are the right questions to ask about outcomes?

Start with outcomes. A vendor who can't describe what success looks like in your business, in your numbers, is selling a tool, not a result. Before you talk pricing, you need to pin down what actually changes if the thing works.

Here are the five outcome questions:

  1. What specific metric will this improve in my business?
  2. How will we measure the before and after?
  3. What's a realistic range of improvement for a company my size?
  4. How long until I should expect to see results?
  5. If we don't hit those numbers, what happens?

The first question flushes out vague pitches fast. If a vendor answers "productivity" or "efficiency" without naming a number, you don't have a buying decision, you have a brochure. Good vendors will say something like "response time on inbound leads drops from 4 hours to under 10 minutes, and close rate on those leads typically moves 8 to 15 percent."

Question two, about measurement, is where a lot of vendors get caught. If they can't tell you how you'll know the tool is working, it's because nobody else is measuring it either, and that's usually because the results don't hold up under a microscope. Ask them what dashboards you'll have, how often the data updates, and whether you can pull it into your own reporting without asking permission.

Question five is the one most buyers skip. Ask it anyway. You're not trying to trap the vendor. You're finding out whether they stand behind their pitch or whether they vanish after the implementation call. The best answer I've heard was "if you don't hit the numbers we projected in 60 days, we'll work at cost until you do, or we'll refund the implementation fee." That's a vendor who believes their own pitch.

What should I ask about data and ownership?

Your data is the asset. If you don't get this part right, you can end up training a competitor's model for free. The five data questions protect you from that.

  1. Where does my data live, physically and logically?
  2. Who owns the outputs, me or the vendor?
  3. Is my data used to train any shared model?
  4. Can I export everything in a standard format if I leave?
  5. What happens to my data 30 days after I cancel?

On question three, read the terms. A lot of tools default to "we may use de-identified data to improve our services," which in practice means your patterns feed the machine. Sometimes that's fine. Sometimes it's a knife in the back of your business, especially if you have proprietary process knowledge in what you're uploading.

Export is the one that surprises people. Plenty of AI tools are happy to ingest your data but allergic to giving it back in a clean format. Test this before you sign. Ask for a sample export of a real account. If the vendor stalls, you have your answer.

Question five, about data retention after cancellation, matters more than it sounds. Some vendors hold your data hostage for 60 to 90 days after cancellation, hoping you'll reconsider. Others delete it in 24 hours. Neither is inherently wrong, but you want to know which one you're dealing with before a breakup, not after. Ask for the specific retention window in writing, and ask whether you can request immediate deletion if you choose to.

What are the right cost questions to ask an AI vendor?

Pricing is where most small businesses get quietly bled. The license fee is almost never the total cost. Ask the four cost questions and do the math yourself.

  1. What's the fully loaded annual cost, including implementation, usage, and support?
  2. How does pricing scale as my team or usage grows?
  3. What are the realistic implementation hours from my side?
  4. Are there any per-seat, per-query, or per-API fees that are metered?

A common pattern: the sticker price is $499 a month, but once you add onboarding, integration, training, a Zapier or Make subscription, and the hours your team spends setting it up, the real year-one cost is four or five times the quoted number. That's not necessarily bad. It's only bad when you didn't budget for it.
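The year-one math is worth doing explicitly before you sign. Here is a minimal sketch of that calculation; every number in it is a placeholder assumption, so swap in the figures from your own quotes:

```python
# Illustrative year-one cost model for an AI tool purchase.
# All figures are assumptions for illustration, not vendor quotes.

MONTHLY_LICENSE = 499     # the sticker price
ONBOARDING_FEE = 5_000    # one-time implementation fee
INTEGRATION_TOOLS = 600   # e.g. a year of a Zapier or Make plan
INTERNAL_HOURS = 40       # your team's setup time in month one
INTERNAL_RATE = 100       # blended hourly cost of your team

license_cost = MONTHLY_LICENSE * 12
internal_cost = INTERNAL_HOURS * INTERNAL_RATE
year_one = license_cost + ONBOARDING_FEE + INTEGRATION_TOOLS + internal_cost

print(f"Sticker-price year one: ${license_cost:,}")
print(f"Fully loaded year one:  ${year_one:,}")
```

The point of the exercise isn't the exact multiple; it's that the line items beyond the license fee are real money and belong in the budget conversation before the contract, not after.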

Metered pricing deserves its own attention. Tools that charge per query, per generation, or per token can create unpredictable bills. I've seen clients get a $3,200 bill in a month they didn't see coming. Ask for a ceiling, or build your own alerting.
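If the vendor won't give you a hard spending ceiling, a crude guardrail on your side is better than nothing. A minimal sketch of that kind of alert, assuming you can pull month-to-date spend from the vendor's billing export or API (the fetch step and all the numbers here are placeholders):

```python
# Crude metered-spend guardrail: compare month-to-date spend
# against a budget and flag when a threshold is crossed.

MONTHLY_BUDGET = 800.00   # what you actually planned to spend
ALERT_THRESHOLD = 0.75    # warn at 75% of budget

def get_month_to_date_spend() -> float:
    # Placeholder: in practice, read this from the vendor's
    # billing dashboard export, invoice API, or a usage log.
    return 650.00

def check_spend(spend: float, budget: float, threshold: float) -> str:
    ratio = spend / budget
    if ratio >= 1.0:
        return "over budget"
    if ratio >= threshold:
        return "warning"
    return "ok"

status = check_spend(get_month_to_date_spend(), MONTHLY_BUDGET, ALERT_THRESHOLD)
print(status)  # wire this to email or Slack instead of printing
```

Even a script this simple, run daily, turns a surprise $3,200 invoice into a mid-month warning you can act on.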

The implementation hours on your side are the variable most buyers underestimate, usually by half. A vendor will say "onboarding takes about 2 hours," and that might be true for their team, but your team is going to spend 20 to 40 hours the first month pulling data, mapping workflows, training users, and fixing the inevitable edge cases. Multiply those hours by your team's hourly cost, and you have a number that often dwarfs the license fee in year one. Better to know that going in than to discover it in week three.

What should I ask about the team behind the tool?

The people shipping the tool matter more than the tool itself, especially in AI, where everything changes every six months. The four team questions tell you whether this vendor survives the next cycle.

  1. Who's the founder, and are they still involved day to day?
  2. How large is the engineering team, and where are they based?
  3. What's your roadmap for the next 12 months?
  4. Can you share three customer references who had problems and how you resolved them?

Question four is the one I love. Anyone can provide happy references. The real tell is whether a vendor can walk you through a rocky engagement that ended well. If they say "we've never had a customer struggle," they're either new or lying. Both are a problem.

Also pay attention to founder involvement. In early-stage AI, the founder is usually the one making the judgment calls that matter. If the founder has already moved on to "strategic" work and sales is being handled by someone three layers removed, the tool's soul may have left the building.

Roadmap questions are worth asking even if you suspect you'll get a canned answer. The goal isn't to hold the vendor to a promise, it's to learn whether they think about the next 12 months in terms of customer problems or in terms of features they can demo. A healthy roadmap reads like "we're solving X for customer segment Y by Q3." A worrying roadmap reads like "we're adding more integrations, better UI, and AI capabilities." The second one means they don't know what they're building, they're just adding to the pile.

What exit questions should I ask before signing?

Plan your exit before you walk in. The five exit questions sound defensive, but they're how you stay in control of the relationship.

  1. What's the minimum contract length, and what are the cancellation terms?
  2. What does offboarding look like in practice?
  3. Do you charge anything on the way out?
  4. Is there a non-compete or data clause that restricts me after cancellation?
  5. What does your average customer do 24 months in, renew, expand, or leave?

Month-to-month is usually worth paying a small premium for. Annual contracts lock you in at a time when the right AI choice this month might be the wrong one in six. Unless the vendor is discounting the annual deal by 25 to 40 percent, hold the line on monthly.

The last question is the sleeper. A vendor who can tell you honestly that "about 30 percent of our customers don't renew, usually because they outgrow us or fold the capability into a broader tool" is being real with you. A vendor who swears nobody ever leaves has either a short memory or a tight NDA culture.

Question two, about offboarding, is where professional vendors separate from amateur ones. A professional offboarding process looks like this: a 30-day notice period, a clean data export in a standard format, a shutdown confirmation in writing, and a named human you can email if something breaks. An amateur offboarding process looks like a support ticket that takes two weeks to answer and a CSV that's missing half your fields. Ask for their offboarding documentation before you sign. If they don't have any, that tells you they haven't thought about it, which means you're going to be the test case.

How do I use this checklist in a real vendor meeting?

You don't run 23 questions in a single call. You pick the five or six that map to your biggest risk, and you ask them first. If the vendor flinches on any of them, the rest of the meeting is a courtesy.

When I'm sitting next to a client in a vendor pitch, I usually open with the outcome questions, then jump straight to the exit questions. It's a deliberate sequence. Outcomes tell me if there's a real business case. Exit tells me if the vendor respects my client's autonomy. Everything else, the data, cost, and team questions, sits in the middle and gets worked through before anything gets signed.

If the vendor is solid, they'll thank you for being thorough. I've had vendors tell me on the way out, "finally, someone who knows what to ask." That's a signal. It means the vendor is used to selling to people who don't push, and they respect people who do.

Buying AI well isn't about being an AI expert. It's about being a disciplined buyer. If you'd like a second set of eyes on a vendor you're evaluating, or you want help putting these questions into your specific context, I'm happy to walk through it with you on a short discovery call, or you can start with a free audit of your current stack to see what's actually earning its keep.

Common questions

How long should a typical AI vendor evaluation take?

Budget two to four weeks for a real evaluation. That gives you time for two vendor calls, a reference conversation, a pricing review, and a short internal alignment meeting. Anything faster than two weeks usually means you're being pushed, not led.

Should I ask for a proof of concept before buying?

Yes, and insist on one tied to your data and a defined success metric. A good POC runs 14 to 30 days, uses a real workflow, and ends with a go or no-go decision against a number you both agreed to up front.

What's a reasonable AI budget for a small business?

Most small businesses I work with end up spending between $500 and $5,000 per month on AI tools combined. The right number depends on what you're trying to do, not what the average is. Start with one workflow and expand only when it's paying for itself.

Can I negotiate AI vendor contracts, or are prices fixed?

Almost everything is negotiable, especially with newer vendors. Discounts of 10 to 25 percent on annual commits are common, and implementation fees are often waived for serious buyers. If a vendor says pricing is fixed, test it by walking away politely.

What if the vendor won't answer some of these questions?

That's useful information. A vendor who won't answer questions about data ownership, exit terms, or pricing transparency is telling you how they'll behave after you sign. Take the signal and keep looking.

Do I need a lawyer to review an AI vendor contract?

For anything over $10,000 a year or anything that touches customer data, yes. A short contract review runs $500 to $1,500 and can save you from terms that quietly hurt you later. For smaller month-to-month tools, a careful read by you is usually enough.