Case Study · Practical

7 Mistakes Found in 30 Minutes:
A Real Mini Audit of an AI-Translated SaaS Onboarding Flow

A North American project management SaaS. Eight months live in Japan. Trial-to-paid conversion roughly half the English number. Here is what a 30-minute onboarding audit found — and why none of it would have been caught by a bilingual proofread.

Munehiro Hiraki
Japanese Localization QA Specialist · 10+ Years FinTech
May 2026 · Case Study · ⏱ 6 min read

Most SaaS teams treat Japanese onboarding as a translation problem. Run the English copy through DeepL or GPT-4, have someone "check it," ship it. The flow looks fine in QA. The Japanese trial-to-paid conversion stays 40–60% lower than English, and nobody can quite say why.

The reason is rarely a single catastrophic mistranslation. It is a pattern of small, polite-sounding choices that quietly erode trust over the first 30 minutes a Japanese user spends in your product. None of them are "wrong" by dictionary standards. All of them tell a Japanese decision-maker that this product was not built for them.

This is a walkthrough of an actual 30-minute Mini Audit on a North American project management SaaS, anonymized but otherwise unchanged. Seven issues surfaced before the timer ran out. Each one is the kind of thing AI translation produces by default, and human reviewers without localization training routinely approve.

30 min · Audit time box
80 · Unique strings reviewed
7 · Issues surfaced
~9% · Defect rate in onboarding

The Setup: A North American Project Management SaaS

The product is a mid-market project management tool. Series B, around 200 employees, strong English-speaking customer base across North America and parts of Europe. The Japanese version had been live for about eight months. Translation pipeline: source strings exported from their CMS, machine translated through a leading neural MT system, reviewed by a bilingual contractor, deployed.

The team's question was simple: "Our Japanese signups are healthy, but trial-to-paid is roughly half of our English number. Is the product the problem, or is it the copy?"

We scoped a Mini Audit on the onboarding flow only. Account creation through first project setup. Five screens. Roughly 80 unique strings. Thirty-minute time box. The goal was not a complete report — it was to surface enough evidence to know whether a deeper audit was justified.

Seven issues surfaced in 30 minutes. Here they are in the order we found them.

The 7 Issues Discovered

1
Issue #1 — Welcome Screen
Imperative voice where invitation is expected
❌ AI Output
プロジェクトを作成してください。
✅ Corrected
さっそくプロジェクトを作ってみましょう。
The original English read "Create your first project." Neutral, friendly, action-oriented. The MT output rendered it as a polite imperative — grammatically correct, tonally wrong. Japanese onboarding copy at this moment is supposed to invite, not instruct. The 〜ましょう form signals "let's do this together," which is exactly what a first-time user needs from a product they don't yet trust.
High Impact
2
Issue #2 — Integrations Setup Screen
"Skip" button: literal translation strips the meaning
❌ AI Output
スキップ
✅ Corrected
あとで設定する
The "Skip" button had been left as katakana. Functionally readable, but it forces the user to compute: skip what, exactly? Skip forever? Skip and lose data? 「あとで設定する」 (set this up later) tells the user precisely what is happening — the configuration is deferred, not abandoned. Users who hesitate at this button either backtrack (slowing setup) or click and regret it (creating support tickets later).
High Impact
3
Issue #3 — Empty State
Technically translated, emotionally absent
❌ AI Output
プロジェクトがありません。
✅ Corrected
まだプロジェクトはありません。最初の1つを作ってみましょう。
The English empty state read: "No projects yet. Create your first one to get started." The MT output gave back the first sentence accurately and dropped the second half entirely — a known failure mode where MT systems collapse two-sentence patterns into one when the second sentence is heavily contextual. The result: a flat, almost passive-aggressive statement. The user lands on an empty screen and gets told there's nothing here. No invitation, no next step. Empty states are conversion-critical surfaces — losing the CTA half is a measurable drop in first-action completion.
High Impact
4
Issue #4 — Feature Tooltip
Keigo register mismatch and excessive length
❌ AI Output (53 chars)
こちらの機能をご利用いただくことで、プロジェクトの進捗状況をリアルタイムでご確認いただけるようになります。
✅ Corrected (24 chars)
プロジェクトの進捗をリアルタイムで確認できます。
A tooltip on a small UI icon had been rendered in full keigo (sonkeigo + kenjōgo combined) and ran 53 characters. Tooltips need to be scannable in under two seconds. The over-polite phrasing was a default MT tendency: when in doubt, escalate the politeness level. The result reads as nervous, almost apologetic, for a feature description. Users skip overlong tooltips — the information they were supposed to convey doesn't land, increasing the chance of mis-clicks and confused first sessions.
Medium Impact
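Length problems like this one are mechanically detectable before a human ever reads the string. A minimal sketch of a first-pass length check — the 30-character budget and the string keys are illustrative assumptions, not the client's actual data or threshold:

```python
# Flag Japanese tooltip strings that exceed a scannable length budget.
# The 30-character budget is an illustrative assumption; tune it per surface
# (tooltips, toasts, and button labels each deserve their own budget).
TOOLTIP_BUDGET = 30

def over_budget(strings: dict[str, str], budget: int = TOOLTIP_BUDGET) -> list[str]:
    """Return the keys of strings longer than the character budget."""
    return [key for key, text in strings.items() if len(text) > budget]

# Hypothetical string keys; the values are the before/after from this issue.
tooltips = {
    "tooltip.progress": "こちらの機能をご利用いただくことで、プロジェクトの進捗状況をリアルタイムでご確認いただけるようになります。",
    "tooltip.progress.fixed": "プロジェクトの進捗をリアルタイムで確認できます。",
}
print(over_budget(tooltips))  # → ['tooltip.progress']
```

A check like this only finds length; the register problem (the keigo escalation) still needs a human pass.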
5
Issue #5 — Error Messages (×5)
「お客様」 overuse in inline UI errors
❌ AI Output
お客様のメールアドレスは既に登録されております。
✅ Corrected
このメールアドレスはすでに登録されています。
Five different error messages in the onboarding flow began with 「お客様」. In Japanese B2B SaaS, this register reads as customer-service desk language, not product UI language. Modern Japanese SaaS — Money Forward, Sansan, freee — almost never use 「お客様」 in inline UI errors. They address the situation, not the person. Each 「お客様」 in an error message reinforces the impression that the product was localized by someone who has never used a Japanese SaaS product as their daily driver.
High Impact
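Register markers like 「お客様」 are greppable, which makes this category a good candidate for an automated pre-screen. A minimal sketch — the marker list and string keys are illustrative assumptions, and a real style guide would carry a longer, surface-specific list:

```python
# Flag inline UI strings that carry customer-service-desk register.
# The marker list is an illustrative assumption; extend it from your style guide.
SERVICE_DESK_MARKERS = ("お客様", "おります")

def register_flags(strings: dict[str, str]) -> dict[str, list[str]]:
    """Map each string key to the register markers it contains, if any."""
    flags = {}
    for key, text in strings.items():
        hits = [m for m in SERVICE_DESK_MARKERS if m in text]
        if hits:
            flags[key] = hits
    return flags

# Hypothetical keys; values are the before/after from this issue.
errors = {
    "error.email.duplicate": "お客様のメールアドレスは既に登録されております。",
    "error.email.duplicate.fixed": "このメールアドレスはすでに登録されています。",
}
print(register_flags(errors))  # → {'error.email.duplicate': ['お客様', 'おります']}
```

The point of the pre-screen is triage: it tells you which of 80 strings deserve the human register review, not whether any given hit is actually wrong.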
6
Issue #6 — Forward Button (All 5 Screens)
「次へ」vs「進む」vs「続ける」: inconsistent across screens
❌ AI Output — 3 different terms used
Screen 1: 次へ
Screen 2: 続ける
Screen 3: 進む
Screen 4: 次へ
Screen 5: 続ける
✅ Corrected — Consistent across all screens
Screen 1–5: 次へ
(「続ける」 fits resumption flows; 「進む」 is not suited to primary CTAs)
Each term is acceptable in isolation. Together, they signal that no one owned the terminology. The English source consistently used "Continue" — the Japanese should consistently use one term across the flow. Recommendation: 「次へ」 for sequential setup steps; 「続ける」 reads better for resumption flows (returning to an interrupted session); 「進む」 is rarely the right choice for primary CTAs. Inconsistency at this scale makes users mildly uncertain whether they are in the same flow or have been redirected — uncertainty that compounds across screens.
Medium Impact
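Cross-screen inconsistency is exactly the kind of defect a script catches faster than a reviewer reading screens one at a time. A minimal sketch — the screen keys mirror this audit's finding, but the function is a generic consistency check, not the audit's actual tooling:

```python
from collections import Counter

def label_variants(labels_by_screen: dict[str, str]) -> Counter:
    """Count the distinct labels used for one logical action across screens."""
    return Counter(labels_by_screen.values())

# Hypothetical screen keys; the labels are the ones found in this audit.
forward_button = {
    "screen1": "次へ",
    "screen2": "続ける",
    "screen3": "進む",
    "screen4": "次へ",
    "screen5": "続ける",
}
variants = label_variants(forward_button)
if len(variants) > 1:
    # Three variants across five screens: the terminology has no owner.
    print(f"Inconsistent: {dict(variants)}")
```

Run per logical action (forward, back, skip, cancel) rather than per string: the check is only meaningful once strings are grouped by what they do.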
7
Issue #7 — Destructive Action Dialog
Confirmation dialog: trust construction at the wrong moment
❌ AI Output
本当に削除してもよろしいですか?
✅ Corrected
このプロジェクトを削除します。元に戻せません。続けますか?
The destructive-action confirmation read as a generic "are you sure?" The English source was more informative: "Delete this project? This action can't be undone." The MT output dropped the consequence statement and defaulted to a phrase that sounds, in Japanese, like a polite checkbox rather than a serious warning. In a destructive-action dialog, Japanese users specifically need to see what will happen and whether it is reversible, in that order. Politeness is fine; vagueness is not. Result: either accidental deletions and the support load that follows, or excessive hesitation that makes the product feel risky to use.
High Impact

"None of these seven issues are mistranslations in the strict sense. They would all pass a bilingual proofread. They fail a different test: whether a Japanese decision-maker walks away thinking this is a product they can build their workflow around."

The Business Impact: Trial-to-Paid Conversion at Stake

Seven issues across 80 strings. About a 9% defect rate. None of them would have been flagged by a standard bilingual review, because none of them are grammatically incorrect. They fail a different test: whether a Japanese decision-maker, evaluating this product against a domestic competitor, walks away thinking "this feels like a product I can build my workflow around."

The honest answer for the company we audited: probably not, on first session. Onboarding is the surface where trust gets established or lost. A 9% defect rate concentrated in the highest-stakes 30 minutes of the customer relationship is a conversion problem, not a translation problem.

The pattern across our audits is consistent. Companies that fix onboarding-flow localization see trial-to-paid lift in the Japanese segment that closes a meaningful portion of the gap with English. Not all of it — localization is one variable among many. But it is the variable most under-invested in relative to its impact.

The Core Finding

"The Japanese is grammatically correct" and "the Japanese will convert" are different statements about different things. A bilingual review answers the first question. A localization QA audit answers the second.

The Fix Process: 30 Minutes of Structured Diagnosis

The audit itself was structured, not improvisational. Here is the actual sequence used across those 30 minutes:

1
Minutes 0–10 · User Flow Walk
Walk the entire onboarding flow as a new Japanese user, taking screenshots of every screen and noting first-impression reactions. Issues 1, 3, and 7 surfaced here — they are the kind of problems that only show up when you experience the flow rather than read the strings in a spreadsheet.
2
Minutes 10–20 · String-Level Pass
Compare the Japanese against the English source string by string, flagging register, terminology, and length issues. Issues 2, 4, and 5 came from this pass — they require comparison context to surface.
3
Minutes 20–30 · Consistency Check
Cross-screen audit of terminology, button labels, and tone progression. Issue 6 came from this pass — it is invisible until you look at all five screens side by side.

Thirty minutes is enough time to surface a representative sample of issues. It is not enough time to fix them, write a full report, or audit the full product. It is enough to answer the question: "Is there a problem worth investigating?" In this case, the answer was clearly yes.

Key Takeaways for SaaS PMs

If you are running a Japanese version of your product and your trial-to-paid conversion in Japan trails your English number by more than 20%, localization is a plausible contributor. The seven categories to check first:

  • Invitation tone — Does onboarding copy invite or instruct?
  • Button literalness — Do action labels describe what actually happens, or just translate the English word?
  • Empty-state completeness — Are both sentences of empty-state copy present, including the CTA?
  • Tooltip register — Is tooltip language appropriately concise and informal for a UI context?
  • Error-message register — Does error copy address the situation, or over-formalize with 「お客様」?
  • Terminology consistency — Is the same action called the same thing across all screens?
  • Destructive-action clarity — Do confirmation dialogs state the consequence and its reversibility?

Most teams will not find zero issues. Most teams will find six to ten in their onboarding flow alone. The good news is that this category of issue is fixable in days, not months, once it has been correctly diagnosed.

Munehiro Hiraki
Japanese Localization QA Specialist · 10+ Years FinTech · 15+ Years Translation · 20+ Years International Experience

Munehiro Hiraki is a native Japanese specialist with deep expertise in Japanese localization QA for SaaS, FinTech, and AI companies entering the Japanese market. He has reviewed 50+ pages across FinTech, B2B SaaS, and AI products, identifying over 1,200 localization issues.

More from the Hiraki Localization Blog

SaaS Japan

10 Japanese Localization Mistakes That Cost SaaS Companies Japan Market Share

The most damaging errors across UI, help center, onboarding emails, and CTA copy — with before/after fixes for each.

SaaS Japan

How to Launch Your SaaS in Japan Without Losing Trust on Day One

The complete localization checklist — homepage, pricing, checkout, legal, help center — prioritized by commercial impact.

Localization Strategy

The Hidden Cost of "Good Enough" Japanese

Three hidden cost layers — conversion, sales cycle, and churn — and a framework for budgeting Japanese QA as a commercial asset.

Get a Written Report on Your Japanese Onboarding

Hiraki Localization runs 30-minute Mini Audits on Japanese SaaS onboarding flows. Written report, before/after recommendations, prioritized fix list. Most audits surface 5–12 issues in the onboarding flow alone. From $490, delivered in 3–5 business days.