Data Science, ML and Analytics Engineering

How to Become an AI-First Specialist Right Now

For the past couple of years, I’ve been working as a Data Science and Data Analytics consultant. My clients are companies and startups from different countries around the world.

Today I’m convinced: no matter what your profession is — lawyer, recruiter, product manager, or designer — a mandatory requirement for working in 2025 is to be AI-first and fully integrate AI tools and approaches into your work processes.

Why is this critical right now?

Over the past year, I’ve increased my productivity by 3-4 times thanks to the proper use of AI tools. What used to take several days of research, I now do in a day. Projects that required a team of 2-3 people, I now execute alone.

In this post, I’ll share my top AI tools that I use every day, plus additional tools for specific tasks.

NotebookLM: My Personal Researcher

What it is: Google NotebookLM is a tool that transforms any documents into an interactive knowledge base. You upload files, and the AI creates podcasts from them, compiles study cards, and answers questions about the content. Recently, a mobile app has also been released.

How I use it every day:

  • Analyzing new papers – Helps me quickly review the content of all modern research to select those worth studying more deeply.
  • Quick study of YouTube videos – I don’t like sitting through lengthy talks from thematic conferences, so I upload the videos to NotebookLM and read the key points instead.
  • Book summaries – I create convenient book summaries in NotebookLM.

The Study Guide feature automatically creates:

  • Timeline of key events
  • FAQ on the book’s topic
  • Mind maps of connections between concepts
  • Comparative analysis tables

Concrete example of results:

Recently I had a large 50-page contract in English, and it was difficult to navigate to find the important parts of the agreement: detailed provisions on payments, reimbursable expenses, insurance, and so on. NotebookLM greatly simplified navigating the document.

The main advantage of NotebookLM is that it relies strictly on the material you provide, so it rarely hallucinates or makes up facts.

Here are examples of good prompts for NotebookLM:

Finding contradictions:

“Compare conclusions from different sources – where do they diverge and why?”

“Identify conflicting data between studies and explain possible reasons”

Trend analysis:

“What patterns are repeated in most sources?”

“Identify the 3 most significant trends and their potential impact”

Strategic analysis:

“Build a SWOT matrix using all collected data”

“Conduct a gap analysis between current state and desired results”

Prompts for comprehensive research:

Hidden insights:

“Highlight 5 unexpected patterns that are not obvious on the surface”

“Find patterns that may be non-obvious with superficial analysis”

Data gaps:

“Identify critical shortcomings in the research and ways to eliminate them”

“What important aspects remained unstudied?”

Practical recommendations:

“Formulate an action plan with task prioritization using the Jobs-to-be-Done model”

“Create an implementation roadmap indicating the urgency and importance of each action”

These formulations will help you get more structured and practical data analysis.

NotebookLM limitations: Each source is limited to 500 pages, with a maximum of 50 sources per notebook. It can also be slow when processing large files.

Cursor: VibeCoding

What it is: Cursor is an IDE (development environment) that works like your personal programmer. You describe what needs to be done, and the AI writes code directly in the editor.

My path to Cursor: I can program, but I’m weak at backend and frontend work, and here Cursor is an excellent assistant for building my own projects and services.

How I use it every day:

  • Creating and improving web services
  • Developing Telegram bots
  • Integrating with APIs of various platforms
  • Automating routine tasks through scripts

Concrete example: Over the past month, I created with Cursor’s help:

  1. Telegram bot for fantasy football league analytics and team schedule difficulty output
  2. Demo web service for A/B testing email campaigns
  3. Medical bot for collecting patient medical history

Advanced techniques for working with Cursor:

  • Iterative development: I start with a minimal version, then add functionality step by step
  • Contextual comments: I describe business logic in comments, so Cursor understands the task better
  • @-commands: @docs for referring to documentation, @codebase for working with existing code

Tip: Don’t hesitate to describe the task in business language. Cursor understands technical requirements written in natural language, even in Russian. Instead of “create pandas DataFrame with group_by method”, just say “group data by clients and calculate average order value”.
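For context, that business-language request maps to a one-liner in pandas. The data and column names below are made up purely for illustration:

```python
import pandas as pd

# Hypothetical order data; the column names are illustrative only
orders = pd.DataFrame({
    "client": ["Acme", "Acme", "Beta", "Beta", "Beta"],
    "order_value": [120.0, 80.0, 50.0, 70.0, 90.0],
})

# "Group data by clients and calculate average order value"
avg_order = orders.groupby("client")["order_value"].mean().reset_index()
print(avg_order)
```

This is exactly the kind of translation from business intent to code that Cursor handles well.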

Cursor limitations: Requires basic understanding of programming for debugging. Subscription costs $20/month for Pro version. Sometimes generates excessive code that needs optimization.

Claude: My Universal Assistant

What it is: Claude is an AI assistant from Anthropic that can support conversation on any topic, write text, analyze data, help with problem solving.

Why Claude specifically: I’ve worked with ChatGPT, Gemini and dozens of other models. Each has its strengths, but Claude stands out for its analytical thinking ability and work with complex data. It better understands the context of business tasks, hallucinates less on technical questions and gives more structured answers. And most importantly, it works better with code and business requirements for code.

My main use cases for Claude:

  • Technical documentation: creating technical requirements, architectural solutions
  • Data processing: CSV file analysis, creating interactive charts
  • Communication: composing emails, presentations, client proposals
  • Learning and mentoring: explaining complex concepts, case analysis

Specific usage metrics:

  • Time to prepare a client presentation reduced from 6 hours to 2-3 hours
  • Quality of technical documentation increased by 40% (according to feedback)
  • Data analysis speed increased 2x

My current LLM workflow for code generation:

Briefly: brainstorming and specification development, then planning, and finally execution through code generation with an LLM or Cursor.

Step 1: Refining the idea. Use a conversational LLM to clarify the idea:

“Ask me one question at a time so we can develop a detailed, step-by-step specification for this idea. Here’s the idea: <IDEA>”

Step 2: Planning

“Create a detailed step-by-step implementation plan for this project. Once you have a solid plan, break it down into small, iterative blocks that logically follow each other. <SPEC>”

Step 3: Implementation. Feed the plan into Cursor, or execute the prompts from step 2 one block at a time.
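The three steps above can be sketched as a small script. Here `ask_llm` is a placeholder for whatever chat-model call you actually use (the Anthropic or OpenAI SDK, for example), not a real API:

```python
# Sketch of the idea -> spec -> plan workflow; ask_llm is a stand-in
# for a real chat-model call and just echoes the prompt here.
def ask_llm(prompt: str) -> str:
    # Replace this stub with a real model call in practice.
    return f"[model response to: {prompt[:50]}...]"

def refine_idea(idea: str) -> str:
    # Step 1: develop the spec one question at a time
    return ask_llm(
        "Ask me one question at a time so we can develop a detailed, "
        f"step-by-step specification for this idea. Here's the idea: {idea}"
    )

def plan_project(spec: str) -> str:
    # Step 2: break the spec into small, iterative blocks
    return ask_llm(
        "Create a detailed step-by-step implementation plan for this "
        f"project, broken into small iterative blocks. {spec}"
    )

spec = refine_idea("Telegram bot for fantasy football analytics")
plan = plan_project(spec)
print(plan)
```

In a real session each step is an interactive conversation rather than a single call, but the structure is the same.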

Repomix – My Secret Helper for Working with Others’ Code

What it is: Repomix is a CLI tool that packages an entire repository into one file, preparing it for analysis by AI models. It takes the entire project structure, code, README, documentation and creates a unified context.

Why this is needed: When I need to quickly understand someone else’s project or help a client with an existing codebase, the usual approach takes hours. You have to study the architecture, read the documentation, understand dependencies, and look for entry points.

My workflow with Repomix:

  1. Quick analysis of client projects:
npx repomix path/to/client-project --output analysis.txt

I upload the resulting file to Claude and get:

  • An architectural overview in 5 minutes
  • Identification of potential problems
  • Optimization recommendations
  • A development/refactoring plan

  2. Onboarding to new projects: Instead of a week of studying a legacy codebase, I understand the architecture in a couple of hours.
  3. Code review automation: I generate checklists of potential problems and improvements for the development team.

Advanced usage techniques:

Comparative analysis:

repomix old-version/ --output old.txt
repomix new-version/ --output new.txt

Then Claude compares versions and produces a changelog with impact analysis of changes.

Documentation generation: Repo → Repomix → Claude → automatic technical documentation with architecture diagrams.

Security audit: I upload packaged code and ask Claude to find potential vulnerabilities, hardcoded secrets, insecure practices.

Repomix advantages:

  • Speed: instant overview of any project
  • Context completeness: AI sees the entire project as a whole, not fragments
  • Universality: works with any programming language
  • Simplicity: one command and it’s ready

Disadvantages and limitations:

  • File size: large projects may exceed AI model context limits
  • Confidentiality: you need to be careful with proprietary code
  • Analysis quality: depends on how well the source code is written
  • Doesn’t replace a deep dive: serious changes still require detailed study

Practical advice: Use .repomixignore file to exclude node_modules, build artifacts, tests – so AI can focus on business logic.
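A minimal .repomixignore might look like this; it uses the same pattern syntax as .gitignore, and the exact entries depend on your project:

```
node_modules/
dist/
build/
__pycache__/
tests/
*.lock
.env
```

With these exclusions in place, the packaged file contains mostly business logic instead of dependency and build noise.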

v0.app – Interface Generation with One Prompt

What it is: v0 from Vercel is an AI tool for generating React components and complete interfaces. You describe what you need in natural language and get ready code with modern design and full functionality.

Why this is a revolution for my work: Previously, creating a landing page or interface prototype took 1-2 days, even with ready-made templates. You had to think about layout, styling, responsive design, accessibility. Now going from idea to working prototype takes 15-30 minutes.

My main usage scenarios:

  1. Quick MVPs for clients: Prompt: “Create a landing page for this product. Use the presentation.” Result: a fully ready responsive landing page with modern design, animations, correct colors and typography.

I also used it in my small project fpltools.


Over the past 3 months, I created 3 landing pages with v0’s help, for clients and my own projects.

Workflow for v0 + Cursor integration:

  1. Generate basic structure in v0
  2. Export code to Cursor
  3. Develop business logic and integrations
  4. Deploy through Vercel

Practical prompts for v0:

For landing page:

“Create modern landing page for [product type]. Include hero with gradient background, features section with icons, testimonials carousel, pricing table with recommended plan highlight, FAQ accordion, footer with social links. Use color scheme [colors]. Style: minimalistic/tech/corporate”

For dashboards:

“Make analytics dashboard for [business type]. Need: sidebar navigation, top metrics cards, revenue chart, recent activity table, user growth graph, notifications panel. Dark theme, responsive design”

For forms:

“Create multi-step registration form: personal info → company details → plan selection → payment. Include progress indicator, validation, smooth transitions between steps”

v0 advantages:

  • Speed: from idea to prototype in minutes
  • Code quality: clean, modern React/Next.js code
  • Modern design: automatically follows current design trends
  • Responsive: all components adaptive out of the box
  • Customization: easy to modify for specific needs

Limitations:

  • Subscription: $20/month for full functionality
  • Dependency on Vercel ecosystem: works best with their stack
  • Limited complexity: complex interactive elements may require refinement
  • Generation not always predictable: sometimes need several iterations for desired result

Real Results of AI-First Approach

Over the past 12 months, thanks to AI tools integration:

Operational metrics:

  • Research time reduced by 2x
  • MVP creation speed increased by 2-3x
  • Documentation quality increased (subjective assessment) by 50%

Practical Tips for Beginners

  1. Start with solving your most painful problem Don’t try to master everything at once. If you constantly study new subject areas – try NotebookLM. If you have many routine tasks – start with Claude for automation.
  2. Treat AI like a junior with superpowers Don’t expect perfect results from the first try, but remember – this junior works 24/7, knows all programming languages and reads at the speed of light. Clarify context, ask to redo, explain business requirements.
  3. Create a prompt library for typical tasks When you find a formulation that gives excellent results, save it as a template. I have proven prompts for A/B test analysis, explaining ML model results to business, and creating client proposals.
  4. Go beyond obvious applications AI can do much more than it seems. For example: try uploading meeting recordings to NotebookLM – you’ll get structured insights and action items.
  5. Measure results Track time before and after implementing each tool. This will help understand ROI and optimize processes.
  6. Build AI-first workflows The biggest magic happens when you use multiple AI tools as a unified system. NotebookLM for quick domain immersion → Claude for requirements analysis and architecture planning → Cursor for technical solution implementation → Claude again for translating results into business conclusions.

Tools That Didn’t Stick: Honest Experience

n8n – beautiful idea, complex implementation

n8n is positioned as a no-code platform for workflow automation – a visual editor where you can connect various services and APIs without programming. Sounds perfect for my data integration and process automation tasks.

Why I tried it: I wanted to automate a client data collection pipeline: Google Sheets → processing → sending reports to Telegram/Email. It seemed like n8n was perfect for this.


What went wrong:

Steep learning curve: Despite the “no-code” positioning, you need to understand REST APIs, webhooks, and data formats to work effectively. The paradox: if you understand all that, it’s easier to just write code.

Debugging nightmare: When a workflow breaks (and it breaks often), finding the problem is extremely difficult. There are no proper logs, and error messages are uninformative. I spent 4 hours debugging something that takes 15 minutes to fix in code.

Performance issues: When processing large data volumes (10k+ records) workflows start slowing down and crashing. Self-hosted version requires serious resources.

Vendor lock-in: Business process logic is locked in n8n visual schemas. Migration to another solution = rewriting everything from scratch.

Final verdict: After 2 weeks of experiments, I returned to a combination of Python scripts, cron jobs, and simple bash scripts. It turned out faster, more reliable, and much more flexible.
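For context, the replacement pipeline can be as small as the sketch below. The CSV content and column names are invented for illustration, and the Telegram call is stubbed out so the script stays self-contained:

```python
import csv
import io

# Minimal sketch of the Sheets -> processing -> report pipeline that
# replaced the n8n workflow. In practice the CSV would come from a
# Google Sheets export URL; here it is an inline sample.

def build_report(csv_text: str) -> str:
    # Parse the exported sheet and aggregate a simple metric
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    total = sum(float(r["amount"]) for r in rows)
    return f"Processed {len(rows)} records, total amount: {total:.2f}"

def send_to_telegram(text: str) -> None:
    # In practice: POST to the Bot API sendMessage endpoint
    # (https://api.telegram.org/bot<TOKEN>/sendMessage) via urllib
    # or the requests library; stubbed here.
    print(text)

sample = "client,amount\nAcme,120\nBeta,80\n"
send_to_telegram(build_report(sample))
# Schedule with cron, e.g.: 0 9 * * * /usr/bin/python3 /opt/report.py
```

Each piece is trivially debuggable on its own, which is exactly what the n8n visual workflow lacked.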

Lesson: No-code solutions work great for simple, standard scenarios. But as soon as specific requirements appear, traditional programming (especially with Cursor’s help) proves more effective.

Pitfalls and How to Avoid Them

Over-reliance on AI: Always critically verify important facts and decisions. AI is a tool for augmentation, not replacement of your thinking.

Ignoring privacy: Don’t upload confidential client data to public AI services without their consent.

Lack of versioning: Save prompts and processes that work. AI models get updated, and results can change.

Perfectionism: Don’t wait for perfect results from the first attempt. Iterate and improve processes gradually.

Brief Conclusions

Don’t wait for the “perfect moment.” The best way to understand AI capabilities is to start using it right now. Start with simple tasks, gradually increase process complexity and you’ll see how your professional value reaches a new level.

P.S. If you have questions about implementing AI tools in your field or want to share your experience – write in the comments. Always happy to discuss practical cases and share insights.


If you liked the article, subscribe to my Telegram channel: https://t.me/renat_alimbekov

