
moai-lang-scala

Scala 3.4+ development specialist covering Akka, Cats Effect, ZIO, and Spark patterns. Use when building distributed systems, big data pipelines, or functional programming applications.

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 2
Hot score: 79
Updated: March 20, 2026
Overall rating: C
Composite score: 1.6
Best-practice grade: B (77.6)

Install command

npx @skill-hub/cli install junseokandylee-rallyapp-moai-lang-scala
Tags: scala, akka, cats-effect, zio, spark, functional-programming

Repository

junseokandylee/RallyApp

Skill path: .claude/skills/moai-lang-scala

Scala 3.4+ development specialist covering Akka, Cats Effect, ZIO, and Spark patterns. Use when building distributed systems, big data pipelines, or functional programming applications.

Open repository

Best for

Primary workflow: Analyze Data & AI.

Technical facets: Data / AI.

Target audience: everyone.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: junseokandylee.

This is a mirrored public skill entry. Review the repository before installing it into production workflows.

What it helps with

  • Install moai-lang-scala into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/junseokandylee/RallyApp before adding moai-lang-scala to shared team environments
  • Use moai-lang-scala for Scala language workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.

Original source / Raw SKILL.md

---
name: moai-lang-scala
description: Scala 3.4+ development specialist covering Akka, Cats Effect, ZIO, and Spark patterns. Use when building distributed systems, big data pipelines, or functional programming applications.
version: 1.0.0
category: language
tags:
  - scala
  - akka
  - cats-effect
  - zio
  - spark
  - functional-programming
context7-libraries:
  - /akka/akka
  - /typelevel/cats-effect
  - /zio/zio
  - /apache/spark
related-skills:
  - moai-lang-java
  - moai-domain-database
updated: 2025-12-07
status: active
---

## Quick Reference (30 seconds)

Scala 3.4+ Development Specialist - Functional programming, effect systems, and big data.

Auto-Triggers: Scala files (`.scala`, `.sc`), build files (`build.sbt`, `project/build.properties`)

Core Capabilities:
- Scala 3.4: Given/using, extension methods, enums, opaque types, match types
- Akka 2.9: Typed actors, streams, clustering, persistence
- Cats Effect 3.5: Pure FP runtime, fibers, concurrent structures
- ZIO 2.1: Effect system, layers, streaming, error handling
- Apache Spark 3.5: DataFrame API, SQL, structured streaming
- Testing: ScalaTest, Specs2, MUnit, Weaver

Key Ecosystem Libraries:
- HTTP: Http4s 0.24, Tapir 1.10
- JSON: Circe 0.15, ZIO JSON 0.6
- Database: Doobie 1.0, Slick 3.5, Quill 4.8
- Streaming: FS2 3.10, ZIO Streams 2.1

---

## Implementation Guide (5 minutes)

### Scala 3.4 Core Features

Extension Methods:
```scala
extension (s: String)
  def words: List[String] = s.split("\\s+").toList
  def truncate(maxLen: Int): String =
    if s.length <= maxLen then s else s.take(maxLen - 3) + "..."
  def isBlank: Boolean = s.trim.isEmpty

extension [A](list: List[A])
  def second: Option[A] = list.drop(1).headOption
  def penultimate: Option[A] = list.dropRight(1).lastOption
```

Given and Using (Context Parameters):
```scala
trait JsonEncoder[A]:
  def encode(value: A): String

given JsonEncoder[String] with
  def encode(value: String): String = s"\"$value\""

given JsonEncoder[Int] with
  def encode(value: Int): String = value.toString

given [A](using encoder: JsonEncoder[A]): JsonEncoder[List[A]] with
  def encode(value: List[A]): String =
    value.map(encoder.encode).mkString("[", ",", "]")

def toJson[A](value: A)(using encoder: JsonEncoder[A]): String =
  encoder.encode(value)

// Usage
val json = toJson(List(1, 2, 3)) // "[1,2,3]"
```

Enum Types and ADTs:
```scala
enum Color(val hex: String):
  case Red extends Color("#FF0000")
  case Green extends Color("#00FF00")
  case Blue extends Color("#0000FF")
  case Custom(override val hex: String) extends Color(hex)

enum Result[+E, +A]:
  case Success(value: A)
  case Failure(error: E)

  def map[B](f: A => B): Result[E, B] = this match
    case Success(a) => Success(f(a))
    case Failure(e) => Failure(e)

  def flatMap[E2 >: E, B](f: A => Result[E2, B]): Result[E2, B] = this match
    case Success(a) => f(a)
    case Failure(e) => Failure(e)
```

Opaque Types:
```scala
object UserId:
  opaque type UserId = Long
  def apply(id: Long): UserId = id
  def fromString(s: String): Option[UserId] = s.toLongOption
  extension (id: UserId)
    def value: Long = id
    def asString: String = id.toString

export UserId.UserId

object Email:
  opaque type Email = String
  def apply(email: String): Either[String, Email] =
    if email.contains("@") && email.contains(".") then Right(email)
    else Left(s"Invalid email: $email")
  extension (email: Email)
    def value: String = email
    def domain: String = email.split("@").last
```

Union and Intersection Types:
```scala
// Union types
type StringOrInt = String | Int

def describe(value: StringOrInt): String = value match
  case s: String => s"String: $s"
  case i: Int => s"Int: $i"

// Intersection types
trait HasName:
  def name: String

trait HasAge:
  def age: Int

type Person = HasName & HasAge

def greet(person: Person): String =
  s"Hello ${person.name}, age ${person.age}"
```

### Cats Effect 3.5

Basic IO Operations:
```scala
import cats.effect.*
import cats.syntax.all.*
import java.io.{BufferedReader, FileReader}

def program: IO[Unit] =
  for
    _ <- IO.println("Enter your name:")
    name <- IO.readLine
    _ <- IO.println(s"Hello, $name!")
  yield ()

// Resource management
def withFile[A](path: String)(use: BufferedReader => IO[A]): IO[A] =
  Resource
    .make(IO(new BufferedReader(new FileReader(path))))(r => IO(r.close()))
    .use(use)

// Error handling
def fetchUser(id: Long): IO[User] =
  IO.fromOption(repository.findById(id))(UserNotFound(id))
    .handleErrorWith {
      case _: UserNotFound => IO.raiseError(new Exception(s"User $id not found"))
    }
```

Concurrent Programming:
```scala
import cats.effect.*
import cats.effect.std.*
import cats.syntax.all.*

// Parallel execution
def fetchUserData(userId: Long): IO[UserData] =
  (fetchUser(userId), fetchOrders(userId), fetchPreferences(userId))
    .parMapN(UserData.apply)

// Fibers for background processing
def processInBackground(task: IO[Unit]): IO[Unit] =
  task.start.flatMap(fiber =>
    IO.println("Task started") *> fiber.join.void
  )

// Semaphore for rate limiting
def rateLimitedRequests[A](tasks: List[IO[A]], max: Int): IO[List[A]] =
  Semaphore[IO](max).flatMap { sem =>
    tasks.parTraverse(task => sem.permit.use(_ => task))
  }

// Ref for shared state
def counter: IO[Ref[IO, Int]] = Ref.of[IO, Int](0)
def increment(ref: Ref[IO, Int]): IO[Int] = ref.updateAndGet(_ + 1)
```

Streaming with FS2:
```scala
import fs2.*
import fs2.io.file.*

def processLargeFile(path: Path): Stream[IO, String] =
  Files[IO].readUtf8Lines(path)
    .filter(_.nonEmpty)
    .map(_.toLowerCase)
    .evalTap(line => IO.println(s"Processing: $line"))

def writeResults(path: Path, lines: Stream[IO, String]): IO[Unit] =
  lines.intersperse("\n")
    .through(text.utf8.encode)
    .through(Files[IO].writeAll(path))
    .compile.drain
```

### ZIO 2.1

Basic ZIO Operations:
```scala
import zio.*
import java.io.{BufferedReader, FileReader, IOException}

val program: ZIO[Any, Nothing, Unit] =
  for
    _ <- Console.printLine("Enter your name:")
    name <- Console.readLine
    _ <- Console.printLine(s"Hello, $name!")
  yield ()

def fetchUser(id: Long): ZIO[UserRepository, UserError, User] =
  for
    repo <- ZIO.service[UserRepository]
    user <- ZIO.fromOption(repo.findById(id)).orElseFail(UserNotFound(id))
  yield user

// Resource management
def withFile[A](path: String): ZIO[Scope, IOException, BufferedReader] =
  ZIO.acquireRelease(
    ZIO.attempt(new BufferedReader(new FileReader(path))).refineToOrDie[IOException]
  )(reader => ZIO.succeed(reader.close()))
```

ZIO Layers (Dependency Injection):
```scala
trait UserRepository:
  def findById(id: Long): Task[Option[User]]
  def save(user: User): Task[User]

trait EmailService:
  def sendEmail(to: String, subject: String, body: String): Task[Unit]

case class UserRepositoryLive(db: Database) extends UserRepository:
  def findById(id: Long): Task[Option[User]] =
    ZIO.attempt(db.query(s"SELECT * FROM users WHERE id = $id")).map(_.headOption)
  def save(user: User): Task[User] =
    ZIO.attempt(db.insert("users", user)).as(user)

object UserRepositoryLive:
  val layer: ZLayer[Database, Nothing, UserRepository] =
    ZLayer.fromFunction(UserRepositoryLive.apply)

// Composing layers
val appLayer = Database.layer >>> (UserRepositoryLive.layer ++ EmailServiceLive.layer)

object Main extends ZIOAppDefault:
  def run = program.provide(appLayer)
```

ZIO Streaming:
```scala
import zio.stream.*

def processEvents: ZStream[Any, Throwable, ProcessedEvent] =
  ZStream.fromQueue(eventQueue)
    .filter(_.isValid)
    .mapZIO(enrichEvent)
    .grouped(100)
    .mapZIO(batchProcess)
    .flattenIterables
```

### Akka Typed Actors

Actor Definition:
```scala
import akka.actor.typed.*
import akka.actor.typed.scaladsl.*

object UserActor:
  sealed trait Command
  case class GetUser(id: Long, replyTo: ActorRef[Option[User]]) extends Command
  case class CreateUser(request: CreateUserRequest, replyTo: ActorRef[User]) extends Command
  case class UpdateUser(id: Long, name: String, replyTo: ActorRef[Option[User]]) extends Command

  def apply(repository: UserRepository): Behavior[Command] =
    Behaviors.receiveMessage {
      case GetUser(id, replyTo) =>
        replyTo ! repository.findById(id)
        Behaviors.same
      case CreateUser(request, replyTo) =>
        replyTo ! repository.save(User.from(request))
        Behaviors.same
      case UpdateUser(id, name, replyTo) =>
        val updated = repository.findById(id).map(u => repository.save(u.copy(name = name)))
        replyTo ! updated
        Behaviors.same
    }
```

Akka Streams:
```scala
import akka.{Done, NotUsed}
import akka.stream.*
import akka.stream.scaladsl.*
import scala.concurrent.Future
import scala.concurrent.duration.*

val source: Source[Int, NotUsed] = Source(1 to 1000)
val flow: Flow[Int, String, NotUsed] =
  Flow[Int].filter(_ % 2 == 0).map(_ * 2).map(_.toString)
val sink: Sink[String, Future[Done]] = Sink.foreach(println)

val graph = source.via(flow).toMat(sink)(Keep.right)

// Backpressure handling
val throttledSource = source
  .throttle(100, 1.second)
  .buffer(1000, OverflowStrategy.backpressure)
```

### Apache Spark 3.5

DataFrame Operations:
```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.*

val spark = SparkSession.builder()
  .appName("Data Analysis")
  .config("spark.sql.adaptive.enabled", "true")
  .getOrCreate()

import spark.implicits.*

val userMetrics = orders
  .groupBy("user_id")
  .agg(
    sum("amount").as("total_spent"),
    count("*").as("order_count"),
    avg("amount").as("avg_order_value")
  )
  .join(users, Seq("user_id"), "left")
  .withColumn("customer_tier",
    when(col("total_spent") > 10000, "platinum")
      .when(col("total_spent") > 1000, "gold")
      .otherwise("standard")
  )
```

Structured Streaming:
```scala
val streamingOrders = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "orders")
  .load()
  .selectExpr("CAST(value AS STRING)")
  .as[String]
  .map(parseOrder)

val aggregated = streamingOrders
  .withWatermark("timestamp", "10 minutes")
  .groupBy(window($"timestamp", "1 hour"), $"product_category")
  .agg(sum("amount").as("hourly_sales"))

aggregated.writeStream
  .format("delta")
  .outputMode("append")
  .option("checkpointLocation", "/checkpoints/sales")
  .start("/output/hourly-sales")
```

---

## Advanced Patterns

### Build Configuration (SBT 1.10)

```scala
ThisBuild / scalaVersion := "3.4.2"
ThisBuild / organization := "com.example"
ThisBuild / version := "1.0.0"

lazy val root = (project in file("."))
  .settings(
    name := "scala-service",
    libraryDependencies ++= Seq(
      "org.typelevel" %% "cats-effect" % "3.5.4",
      "org.typelevel" %% "cats-core" % "2.10.0",
      "co.fs2" %% "fs2-core" % "3.10.0",
      "dev.zio" %% "zio" % "2.1.0",
      "com.typesafe.akka" %% "akka-actor-typed" % "2.9.0",
      "org.http4s" %% "http4s-ember-server" % "0.24.0",
      "io.circe" %% "circe-generic" % "0.15.0",
      "org.tpolecat" %% "doobie-core" % "1.0.0-RC4",
      "org.scalatest" %% "scalatest" % "3.2.18" % Test,
      "org.typelevel" %% "munit-cats-effect" % "2.0.0" % Test
    ),
    scalacOptions ++= Seq("-deprecation", "-feature", "-Xfatal-warnings")
  )
```

### Testing Quick Reference

ScalaTest:
```scala
class UserServiceSpec extends AnyFlatSpec with Matchers:
  "UserService" should "create user successfully" in {
    val result = service.createUser(CreateUserRequest("John", "john@example.com"))
    result.name shouldBe "John"
  }
```

MUnit with Cats Effect:
```scala
class UserServiceSuite extends CatsEffectSuite:
  test("should fetch user") {
    UserService.findById(1L).map { result =>
      assertEquals(result.name, "John")
    }
  }
```

ZIO Test:
```scala
object UserServiceSpec extends ZIOSpecDefault:
  def spec = suite("UserService")(
    test("should find user") {
      for result <- UserService.findById(1L)
      yield assertTrue(result.name == "John")
    }
  )
```

---

## Context7 Integration

Library mappings for latest documentation:
- `/scala/scala3` - Scala 3.4 language reference
- `/typelevel/cats-effect` - Cats Effect 3.5 documentation
- `/zio/zio` - ZIO 2.1 documentation
- `/akka/akka` - Akka 2.9 typed actors and streams
- `/http4s/http4s` - Functional HTTP server/client
- `/apache/spark` - Spark 3.5 DataFrame and SQL
- `/circe/circe` - JSON library
- `/slick/slick` - Database access

---

## Troubleshooting

Common Issues:
- Implicit resolution: Use `scalac -explain` for detailed error messages
- Type inference: Add explicit type annotations when inference fails
- SBT slow compilation: tune `Global / concurrentRestrictions` in build.sbt to control parallel task execution

Effect System Issues:
- Cats Effect: Check for missing `import cats.effect.*` or `import cats.syntax.all.*`
- ZIO: Verify layer composition with `ZIO.serviceWith` and `ZIO.serviceWithZIO`
- Akka: Review actor hierarchy and supervision strategies

---

## Works Well With

- `moai-lang-java` - JVM interoperability, Spring Boot integration
- `moai-domain-backend` - REST API, GraphQL, microservices patterns
- `moai-domain-database` - Doobie, Slick, database patterns
- `moai-quality-testing` - ScalaTest, MUnit, property-based testing
- `moai-infra-kubernetes` - Scala application deployment

---

## Advanced Documentation

For comprehensive reference materials:
- [reference.md](reference.md) - Complete Scala 3.4 coverage, Context7 mappings, performance
- [examples.md](examples.md) - Production-ready code: Http4s, Akka, Spark patterns

---

Last Updated: 2025-12-07
Status: Production Ready (v1.0.0)


---

## Referenced Files

> The following files are referenced in this skill and included for context.

### reference.md

```markdown
# Scala 3.4+ Reference Guide

## Complete Language Coverage

### Scala 3.4 (November 2025)

Version Information:
- Latest: 3.4.2
- Dotty: New compiler with improved type system
- TASTy: Portable intermediate representation
- JVM Target: 11, 17, 21 (recommended: 21)

Core Features:

- Export Clauses: Selective member export from composed objects
- Extension Methods: Type-safe extensions without implicit classes
- Enum Types: Algebraic data types with exhaustive pattern matching
- Opaque Types: Zero-cost type abstractions
- Union Types: A or B type unions for flexible APIs
- Intersection Types: A and B type combinations
- Match Types: Type-level computation and pattern matching
- Inline Methods: Compile-time evaluation and metaprogramming
- Given/Using: Context parameters replacing implicits
- Braceless Syntax: Optional significant indentation
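
Three of these features — match types, inline methods, and export clauses — are not shown elsewhere in this guide. A minimal plain-Scala sketch (the names `Elem`, `maxOf`, `Printer`, and `Service` are illustrative, not from the skill):

```scala
// Match types: type-level pattern matching, reduced at compile time
type Elem[X] = X match
  case String      => Char
  case Array[t]    => t
  case Iterable[t] => t

// Compiles only because Elem[String] reduces to Char
val elemEvidence = summon[Elem[String] =:= Char]

// Inline methods: the body is expanded at each call site
inline def maxOf(inline a: Int, inline b: Int): Int =
  if a > b then a else b

// Export clauses: selectively re-export members of a composed object
object Printer:
  def format(s: String): String = s"[log] $s"

object Service:
  export Printer.format // Service.format now delegates to Printer.format
```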

---

## Context7 Library Mappings

### Core Scala

```
/scala/scala3 - Scala 3.4 language reference
/scala/scala-library - Standard library
```

### Effect Systems

```
/typelevel/cats-effect - Cats Effect 3.5 (Pure FP runtime)
/typelevel/cats - Cats 2.10 (Functional abstractions)
/zio/zio - ZIO 2.1 (Effect system)
/zio/zio-streams - ZIO Streams (Streaming)
```

### Akka Ecosystem

```
/akka/akka - Akka 2.9 (Typed actors, streams)
/akka/akka-http - Akka HTTP (REST APIs)
/akka/alpakka - Akka Alpakka (Connectors)
```

### HTTP and Web

```
/http4s/http4s - Http4s 0.24 (Functional HTTP)
/softwaremill/tapir - Tapir 1.10 (API-first design)
```

### JSON

```
/circe/circe - Circe 0.15 (JSON parsing)
/zio/zio-json - ZIO JSON 0.6 (Fast JSON)
```

### Database

```
/tpolecat/doobie - Doobie 1.0 (Functional JDBC)
/slick/slick - Slick 3.5 (FRM)
/getquill/quill - Quill 4.8 (Compile-time SQL)
```

### Big Data

```
/apache/spark - Apache Spark 3.5
/apache/flink - Apache Flink 1.19
/apache/kafka - Kafka Clients 3.7
```

### Testing

```
/scalatest/scalatest - ScalaTest 3.2
/typelevel/munit-cats-effect - MUnit Cats Effect 2.0
/zio/zio-test - ZIO Test 2.1
```

---

## Testing Patterns

### ScalaTest with Akka TestKit

```scala
class UserActorSpec extends ScalaTestWithActorTestKit with AnyWordSpecLike with Matchers:
  import UserActor.*

  val mockRepository: UserRepository = mock[UserRepository]

  "UserActor" should {
    "return user when found" in {
      val testUser = User(1L, "John", "john@example.com")
      when(mockRepository.findById(1L)).thenReturn(Some(testUser))

      val actor = spawn(UserActor(mockRepository))
      val probe = createTestProbe[Option[User]]()

      actor ! GetUser(1L, probe.ref)

      probe.expectMessage(Some(testUser))
      verify(mockRepository).findById(1L)
    }

    "return None when user not found" in {
      when(mockRepository.findById(999L)).thenReturn(None)

      val actor = spawn(UserActor(mockRepository))
      val probe = createTestProbe[Option[User]]()

      actor ! GetUser(999L, probe.ref)

      probe.expectMessage(None)
    }

    "handle multiple requests concurrently" in {
      val users = (1 to 100).map(i => User(i.toLong, s"User$i", s"user$i@example.com"))
      users.foreach(u => when(mockRepository.findById(u.id)).thenReturn(Some(u)))

      val actor = spawn(UserActor(mockRepository))
      val probes = users.map(_ => createTestProbe[Option[User]]())

      users.zip(probes).foreach { case (user, probe) =>
        actor ! GetUser(user.id, probe.ref)
      }

      users.zip(probes).foreach { case (user, probe) =>
        probe.expectMessage(Some(user))
      }
    }
  }
```

### Cats Effect Testing (MUnit)

```scala
class UserServiceSpec extends CatsEffectSuite:
  val mockRepository = mock[UserRepository[IO]]

  test("should fetch user successfully") {
    val testUser = User(1L, "John", "john@example.com")
    when(mockRepository.findById(1L)).thenReturn(IO.pure(Some(testUser)))

    val service = UserService(mockRepository)

    service.findById(1L).map { result =>
      assertEquals(result, Some(testUser))
    }
  }

  test("should handle concurrent operations") {
    val users = (1 to 10).map(i => User(i.toLong, s"User$i", s"user$i@example.com")).toList
    users.foreach(u => when(mockRepository.findById(u.id)).thenReturn(IO.pure(Some(u))))

    val service = UserService(mockRepository)

    val results = users.parTraverse(u => service.findById(u.id))

    results.map { list =>
      assertEquals(list.flatten.size, 10)
    }
  }

  test("should timeout slow operations") {
    when(mockRepository.findById(any[Long])).thenReturn(IO.sleep(5.seconds) *> IO.none)

    val service = UserService(mockRepository)

    service.findById(1L)
      .timeout(100.millis)
      .attempt
      .map { result =>
        assert(result.isLeft)
        assert(result.left.exists(_.isInstanceOf[TimeoutException]))
      }
  }
```

### ZIO Testing

```scala
object UserServiceSpec extends ZIOSpecDefault:
  val testUser = User(1L, "John", "john@example.com")

  val mockRepositoryLayer: ULayer[UserRepository] = ZLayer.succeed {
    new UserRepository:
      def findById(id: Long): UIO[Option[User]] =
        if id == 1L then ZIO.some(testUser) else ZIO.none
      def save(user: User): UIO[User] = ZIO.succeed(user)
  }

  def spec = suite("UserService")(
    test("should find existing user") {
      for
        service <- ZIO.service[UserService]
        result <- service.findById(1L)
      yield assertTrue(result == Some(testUser))
    }.provide(mockRepositoryLayer, UserService.layer),

    test("should return None for non-existent user") {
      for
        service <- ZIO.service[UserService]
        result <- service.findById(999L)
      yield assertTrue(result.isEmpty)
    }.provide(mockRepositoryLayer, UserService.layer),

    test("should handle parallel requests") {
      for
        service <- ZIO.service[UserService]
        results <- ZIO.foreachPar(1 to 100)(id => service.findById(id.toLong))
      yield assertTrue(results.flatten.size == 1)
    }.provide(mockRepositoryLayer, UserService.layer)
  )
```

### Property-Based Testing (ScalaCheck)

```scala
class UserValidationSpec extends AnyFlatSpec with Matchers with ScalaCheckPropertyChecks:
  "Email validation" should "accept valid emails" in {
    forAll(Gen.alphaNumStr, Gen.alphaNumStr) { (local, domain) =>
      whenever(local.nonEmpty && domain.nonEmpty) {
        val email = s"$local@$domain.com"
        Email(email) shouldBe a[Right[?, ?]]
      }
    }
  }

  "UserId" should "roundtrip through string conversion" in {
    forAll(Gen.posNum[Long]) { id =>
      UserId.fromString(UserId(id).asString) shouldBe Some(UserId(id))
    }
  }
```

---

## Performance Characteristics

### JVM Startup and Memory

- Cold Start: 3-6s (JVM warmup)
- Warm Start: Less than 100ms
- Base Memory: 512MB or more
- GraalVM Native: Not recommended for Scala 3

### Compilation Times

- Clean Build: 60-120s
- Incremental: 10-30s
- With Cache: 30-60s
- Note: Scala 3 compiler is faster than Scala 2

### Framework Throughput

- Http4s (Blaze): 160K requests per second, P99 latency 1.5ms
- Http4s (Ember): 140K requests per second, P99 latency 2ms
- Akka HTTP: 180K requests per second, P99 latency 1.2ms
- ZIO HTTP: 170K requests per second, P99 latency 1.3ms

---

## Development Environment

### IDE Support

- IntelliJ IDEA: Good (Scala plugin required)
- VS Code: Good (Metals extension)
- Neovim: Good (Metals LSP)

### Recommended Plugins

IntelliJ IDEA:
- Scala (by JetBrains)
- ZIO for IntelliJ
- Cats Support

VS Code:
- Scala (Metals)
- Scala Syntax (official)

### Linters and Formatters

- Scalafmt: Code formatter (.scalafmt.conf)
- Scalafix: Linting and refactoring
- WartRemover: Code quality checks

Example .scalafmt.conf:
```hocon
version = 3.7.17
runner.dialect = scala3
maxColumn = 100
indent.main = 2
indent.callSite = 2
align.preset = more
rewrite.rules = [SortImports, RedundantBraces, PreferCurlyFors]
```

---

## Container Optimization

### Docker Multi-Stage Build

```dockerfile
# eclipse-temurin images do not ship sbt; use an sbt-enabled builder image
# (e.g. sbtscala/scala-sbt) or install sbt in this stage first.
FROM eclipse-temurin:21-jdk-alpine AS builder
WORKDIR /app
COPY . .
RUN sbt assembly

FROM eclipse-temurin:21-jre-alpine
RUN addgroup -g 1000 app && adduser -u 1000 -G app -s /bin/sh -D app
WORKDIR /app
COPY --from=builder /app/target/scala-3.4.2/*.jar app.jar
USER app
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

### JVM Tuning for Containers

```yaml
containers:
  - name: app
    image: myapp:latest
    resources:
      requests:
        memory: "512Mi"
        cpu: "500m"
      limits:
        memory: "1Gi"
        cpu: "1000m"
    env:
      - name: JAVA_OPTS
        value: >-
          -XX:+UseContainerSupport
          -XX:MaxRAMPercentage=75.0
          -XX:+UseG1GC
          -XX:+UseStringDeduplication
```

---

## Migration Guide: Scala 2.13 to 3.4

Key Changes:

1. Braceless Syntax: Optional significant indentation
2. Given/Using: Replace `implicit` with `given` and `using`
3. Extension Methods: Replace implicit classes with `extension`
4. Enums: Replace sealed traits with `enum`
5. Export Clauses: Replace trait mixing with exports
6. Opaque Types: Replace value classes with opaque types
7. Union Types: Replace Either with union types where appropriate
8. Match Types: Replace type-level programming patterns

Example Migration:

Scala 2.13:
```scala
implicit class StringOps(s: String) {
  def words: List[String] = s.split("\\s+").toList
}

implicit def jsonEncoder: JsonEncoder[String] = ???
```

Scala 3.4:
```scala
extension (s: String)
  def words: List[String] = s.split("\\s+").toList

given JsonEncoder[String] = ???
```

---

## Effect System Comparison

### Cats Effect vs ZIO

Cats Effect:
- Pure FP approach, minimal runtime
- Better interop with Typelevel ecosystem
- Smaller learning curve from cats-core
- Resource safety via Resource type

ZIO:
- Rich built-in functionality (layers, config, logging)
- Better error handling with typed errors
- Comprehensive testing utilities
- Larger standard library

### When to Use Which

Use Cats Effect When:
- Already using Typelevel libraries (http4s, doobie, fs2)
- Prefer minimal runtime overhead
- Team familiar with tagless final pattern

Use ZIO When:
- Building complex applications with many dependencies
- Need comprehensive error handling
- Prefer opinionated framework with batteries included
- Building applications from scratch
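
The typed-error distinction above can be sketched without either library: ZIO's error channel behaves like an `Either[E, A]` carried in the type, while Cats Effect's `IO` surfaces failures as untyped `Throwable`s. A dependency-free sketch (the names `UserError`, `findUserTyped`, and `findUserThrowable` are hypothetical):

```scala
// ZIO-style: the error type is part of the signature, so callers
// pattern-match on a closed error ADT
sealed trait UserError
final case class UserNotFound(id: Long) extends UserError

def findUserTyped(id: Long): Either[UserError, String] =
  if id == 1L then Right("John") else Left(UserNotFound(id))

// Cats-Effect-style: failures travel as Throwable, so callers must
// match on exception classes instead of an error ADT
def findUserThrowable(id: Long): String =
  if id == 1L then "John"
  else throw new NoSuchElementException(s"User $id not found")
```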

---

Last Updated: 2025-12-07
Version: 1.0.0

```

### examples.md

```markdown
# Scala Production Examples

## REST API Implementations

### Http4s Functional HTTP Service

```scala
// Main.scala
import cats.effect.*
import org.http4s.*
import org.http4s.dsl.io.*
import org.http4s.ember.server.*
import org.http4s.server.Router
import com.comcast.ip4s.*

object Main extends IOApp.Simple:
  def run: IO[Unit] =
    for
      config <- Config.load
      xa <- Database.transactor(config.database)
      repository = UserRepository.make(xa)
      service = UserService.make(repository)
      httpApp = Router(
        "/api/v1" -> UserRoutes(service).routes
      ).orNotFound
      _ <- EmberServerBuilder
        .default[IO]
        .withHost(config.server.host)
        .withPort(config.server.port)
        .withHttpApp(httpApp)
        .build
        .useForever
    yield ()

// UserRoutes.scala
import cats.effect.*
import org.http4s.*
import org.http4s.dsl.io.*
import org.http4s.circe.*
import io.circe.generic.auto.*

class UserRoutes(service: UserService[IO]) extends Http4sDsl[IO]:
  given EntityDecoder[IO, CreateUserRequest] = jsonOf[IO, CreateUserRequest]
  given EntityDecoder[IO, UpdateUserRequest] = jsonOf[IO, UpdateUserRequest]
  given EntityEncoder[IO, User] = jsonEncoderOf[IO, User]
  given EntityEncoder[IO, List[User]] = jsonEncoderOf[IO, List[User]]

  object PageParam extends OptionalQueryParamDecoderMatcher[Int]("page")
  object SizeParam extends OptionalQueryParamDecoderMatcher[Int]("size")

  val routes: HttpRoutes[IO] = HttpRoutes.of[IO] {
    case GET -> Root / "users" :? PageParam(page) +& SizeParam(size) =>
      for
        users <- service.findAll(page.getOrElse(0), size.getOrElse(20))
        response <- Ok(users)
      yield response

    case GET -> Root / "users" / LongVar(id) =>
      service.findById(id).flatMap {
        case Some(user) => Ok(user)
        case None => NotFound()
      }

    case req @ POST -> Root / "users" =>
      for
        request <- req.as[CreateUserRequest]
        result <- service.create(request).attempt
        response <- result match
          case Right(user) => Created(user)
          case Left(_: DuplicateEmailException) => Conflict()
          case Left(e) => InternalServerError(e.getMessage)
      yield response

    case req @ PUT -> Root / "users" / LongVar(id) =>
      for
        request <- req.as[UpdateUserRequest]
        result <- service.update(id, request)
        response <- result match
          case Some(user) => Ok(user)
          case None => NotFound()
      yield response

    case DELETE -> Root / "users" / LongVar(id) =>
      service.delete(id).flatMap {
        case true => NoContent()
        case false => NotFound()
      }
  }

// UserService.scala
trait UserService[F[_]]:
  def findAll(page: Int, size: Int): F[List[User]]
  def findById(id: Long): F[Option[User]]
  def create(request: CreateUserRequest): F[User]
  def update(id: Long, request: UpdateUserRequest): F[Option[User]]
  def delete(id: Long): F[Boolean]

object UserService:
  def make(repository: UserRepository[IO]): UserService[IO] = new UserService[IO]:
    def findAll(page: Int, size: Int): IO[List[User]] =
      repository.findAll(page * size, size)

    def findById(id: Long): IO[Option[User]] =
      repository.findById(id)

    def create(request: CreateUserRequest): IO[User] =
      for
        exists <- repository.existsByEmail(request.email)
        _ <- IO.raiseWhen(exists)(DuplicateEmailException(request.email))
        passwordHash = BCrypt.hashpw(request.password, BCrypt.gensalt())
        user = User(0, request.name, request.email, passwordHash, UserStatus.Pending, Instant.now)
        saved <- repository.save(user)
      yield saved

    def update(id: Long, request: UpdateUserRequest): IO[Option[User]] =
      repository.findById(id).flatMap {
        case Some(existing) =>
          val checkEmail = request.email.filter(_ != existing.email).traverse_ { email =>
            repository.existsByEmail(email).flatMap { exists =>
              IO.raiseWhen(exists)(DuplicateEmailException(email))
            }
          }
          val updated = existing.copy(
            name = request.name,
            email = request.email.getOrElse(existing.email)
          )
          checkEmail *> repository.update(updated).map(Some(_))
        case None => IO.pure(None)
      }

    def delete(id: Long): IO[Boolean] =
      repository.delete(id)

// UserRepository.scala (Doobie)
trait UserRepository[F[_]]:
  def findAll(offset: Int, limit: Int): F[List[User]]
  def findById(id: Long): F[Option[User]]
  def findByEmail(email: String): F[Option[User]]
  def existsByEmail(email: String): F[Boolean]
  def save(user: User): F[User]
  def update(user: User): F[User]
  def delete(id: Long): F[Boolean]

object UserRepository:
  def make(xa: Transactor[IO]): UserRepository[IO] = new UserRepository[IO]:
    import doobie.*
    import doobie.implicits.*
    import doobie.postgres.implicits.*

    def findAll(offset: Int, limit: Int): IO[List[User]] =
      sql"""
        SELECT id, name, email, password_hash, status, created_at
        FROM users
        ORDER BY created_at DESC
        LIMIT $limit OFFSET $offset
      """.query[User].to[List].transact(xa)

    def findById(id: Long): IO[Option[User]] =
      sql"""
        SELECT id, name, email, password_hash, status, created_at
        FROM users WHERE id = $id
      """.query[User].option.transact(xa)

    def findByEmail(email: String): IO[Option[User]] =
      sql"""
        SELECT id, name, email, password_hash, status, created_at
        FROM users WHERE email = $email
      """.query[User].option.transact(xa)

    def existsByEmail(email: String): IO[Boolean] =
      sql"SELECT EXISTS(SELECT 1 FROM users WHERE email = $email)"
        .query[Boolean].unique.transact(xa)

    def save(user: User): IO[User] =
      sql"""
        INSERT INTO users (name, email, password_hash, status, created_at)
        VALUES (${user.name}, ${user.email}, ${user.passwordHash}, ${user.status}, ${user.createdAt})
      """.update.withUniqueGeneratedKeys[Long]("id")
        .map(id => user.copy(id = id))
        .transact(xa)

    def update(user: User): IO[User] =
      sql"""
        UPDATE users SET name = ${user.name}, email = ${user.email}
        WHERE id = ${user.id}
      """.update.run.transact(xa).as(user)

    def delete(id: Long): IO[Boolean] =
      sql"DELETE FROM users WHERE id = $id".update.run.transact(xa).map(_ > 0)

// Models.scala
import io.circe.*
import java.time.Instant

case class User(
  id: Long,
  name: String,
  email: String,
  passwordHash: String,
  status: UserStatus,
  createdAt: Instant
) derives Encoder.AsObject, Decoder

enum UserStatus derives Encoder, Decoder:
  case Pending, Active, Suspended

case class CreateUserRequest(
  name: String,
  email: String,
  password: String
) derives Decoder

case class UpdateUserRequest(
  name: String,
  email: Option[String] = None
) derives Decoder

class DuplicateEmailException(email: String)
  extends RuntimeException(s"Email already exists: $email")
```
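
`UserRepository.make` expects a `Transactor[IO]`. A minimal sketch of building one with doobie-hikari follows; the `AppTransactor` name, JDBC URL, credentials, and pool size are illustrative assumptions, and the `doobie-hikari` module must be on the classpath:

```scala
import cats.effect.*
import doobie.hikari.HikariTransactor
import doobie.util.ExecutionContexts

object AppTransactor:
  // Resource-managed HikariCP pool; connections are released on shutdown.
  val resource: Resource[IO, HikariTransactor[IO]] =
    for
      ce <- ExecutionContexts.fixedThreadPool[IO](16) // pool for awaiting JDBC calls
      xa <- HikariTransactor.newHikariTransactor[IO](
        "org.postgresql.Driver",                 // JDBC driver class
        "jdbc:postgresql://localhost:5432/app",  // placeholder URL
        "app",                                   // placeholder user
        "secret",                                // placeholder password
        ce
      )
    yield xa

// Usage: AppTransactor.resource.use(xa => UserRepository.make(xa).findById(1L))
```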

---

## Big Data Examples

### Spark 3.5 Analytics

```scala
// UserAnalytics.scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.*

object UserAnalytics:
  def main(args: Array[String]): Unit =
    val spark = SparkSession.builder()
      .appName("User Analytics")
      .config("spark.sql.adaptive.enabled", "true")
      .config("spark.sql.shuffle.partitions", "200")
      .getOrCreate()

    import spark.implicits.*

    val users = spark.read.parquet("s3://data/users")
    val orders = spark.read.parquet("s3://data/orders")
    val events = spark.read.parquet("s3://data/events")

    // User lifetime value analysis
    val userLtv = calculateUserLtv(users, orders)

    // User engagement metrics
    val engagement = calculateEngagement(users, events)

    // Cohort analysis
    val cohorts = performCohortAnalysis(users, orders)

    userLtv.write.parquet("s3://output/user-ltv")
    engagement.write.parquet("s3://output/user-engagement")
    cohorts.write.parquet("s3://output/cohorts")

    spark.stop()

  def calculateUserLtv(users: DataFrame, orders: DataFrame): DataFrame =
    orders
      .groupBy("user_id")
      .agg(
        sum("amount").as("total_spent"),
        count("*").as("order_count"),
        avg("amount").as("avg_order_value"),
        min("created_at").as("first_order"),
        max("created_at").as("last_order")
      )
      .join(users, Seq("user_id"), "left")
      .withColumn("days_as_customer",
        datediff(col("last_order"), col("first_order")))
      .withColumn("ltv_score",
        col("total_spent") * (col("order_count") / (col("days_as_customer") + 1)))

  def calculateEngagement(users: DataFrame, events: DataFrame): DataFrame =
    events
      .filter(col("event_date") >= date_sub(current_date(), 30))
      .groupBy("user_id")
      .agg(
        countDistinct("session_id").as("sessions"),
        count("*").as("total_events"),
        sum(when(col("event_type") === "page_view", 1).otherwise(0)).as("page_views"),
        sum(when(col("event_type") === "click", 1).otherwise(0)).as("clicks")
      )
      .join(users, Seq("user_id"), "left")
      .withColumn("engagement_score",
        (col("sessions") * 0.3) + (col("page_views") * 0.2) + (col("clicks") * 0.5))

  def performCohortAnalysis(users: DataFrame, orders: DataFrame): DataFrame =
    val usersWithCohort = users
      .withColumn("cohort_month", date_trunc("month", col("created_at")))

    val ordersWithPeriod = orders
      .withColumn("order_month", date_trunc("month", col("created_at")))

    usersWithCohort
      .join(ordersWithPeriod, "user_id")
      .withColumn("period_number",
        months_between(col("order_month"), col("cohort_month")).cast("int"))
      .groupBy("cohort_month", "period_number")
      .agg(
        countDistinct("user_id").as("users"),
        sum("amount").as("revenue")
      )
      .orderBy("cohort_month", "period_number")
```
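
For local development the same aggregations can run without a cluster. A sketch, assuming local parquet paths; the `local[*]` master and the small shuffle-partition count are choices for in-process runs, not part of the original job:

```scala
import org.apache.spark.sql.SparkSession

// Local-mode harness for exercising the aggregations on sample data.
@main def runLocal(): Unit =
  val spark = SparkSession.builder()
    .appName("User Analytics (local)")
    .master("local[*]")                          // run in-process, all cores
    .config("spark.sql.shuffle.partitions", "4") // small data, few partitions
    .getOrCreate()

  val users  = spark.read.parquet("./data/users")
  val orders = spark.read.parquet("./data/orders")

  UserAnalytics.calculateUserLtv(users, orders).show(20, truncate = false)
  spark.stop()
```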

### Akka Streams Processing

```scala
// StreamProcessing.scala
import akka.Done
import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.kafka.{ConsumerSettings, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.{Consumer, Producer}
import akka.stream.scaladsl.*
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.*
import scala.concurrent.{ExecutionContext, Future}
import scala.concurrent.duration.*

object StreamProcessing:
  given system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "stream-system")
  given ec: ExecutionContext = system.executionContext

  val consumerSettings = ConsumerSettings(system, new StringDeserializer, new ByteArrayDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("processor-group")

  val producerSettings = ProducerSettings(system, new StringSerializer, new ByteArraySerializer)
    .withBootstrapServers("localhost:9092")

  def processEvents(): Future[Done] =
    Consumer
      .plainSource(consumerSettings, Subscriptions.topics("user-events"))
      .map(record => parseEvent(record.value()))
      .filter(_.isValid)
      .mapAsync(4)(enrichEvent)
      .groupedWithin(100, 5.seconds)
      .mapAsync(2)(batchProcess)
      .map(result => new ProducerRecord[String, Array[Byte]](
        "processed-events", result.key, result.toByteArray))
      .runWith(Producer.plainSink(producerSettings))

  // Event, EnrichedEvent, and BatchResult are assumed message types (e.g.
  // protobuf-generated); userService, geoService, and analyticsService are
  // assumed collaborators defined elsewhere.
  def parseEvent(bytes: Array[Byte]): Event =
    Event.parseFrom(bytes)

  def enrichEvent(event: Event): Future[EnrichedEvent] =
    for
      userInfo <- userService.getUser(event.userId)
      geoInfo <- geoService.lookup(event.ipAddress)
    yield EnrichedEvent(event, userInfo, geoInfo)

  def batchProcess(events: Seq[EnrichedEvent]): Future[BatchResult] =
    analyticsService.processBatch(events)
```
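
The plain source above terminates the stream on the first failure. One common hardening step — a sketch, not part of the original pipeline — wraps the Kafka source in a restart-with-backoff supervisor:

```scala
import akka.kafka.Subscriptions
import akka.kafka.scaladsl.Consumer
import akka.stream.RestartSettings
import akka.stream.scaladsl.RestartSource
import scala.concurrent.duration.*

// Backoff window: first retry after 3s, doubling up to 30s, with 20% jitter
// so many restarting streams do not reconnect in lockstep.
val restartSettings = RestartSettings(
  minBackoff = 3.seconds,
  maxBackoff = 30.seconds,
  randomFactor = 0.2
)

// Recreate the Kafka source on failure instead of tearing down the stream.
val resilientSource = RestartSource.onFailuresWithBackoff(restartSettings) { () =>
  Consumer.plainSource(
    StreamProcessing.consumerSettings,
    Subscriptions.topics("user-events"))
}
```

Note that `plainSource` gives at-most-once semantics; for at-least-once delivery you would switch to `Consumer.committableSource` and commit offsets after the producer sink.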

---

## Event Sourcing with Akka

```scala
// UserAggregate.scala
import akka.actor.typed.*
import akka.actor.typed.scaladsl.*
import akka.persistence.typed.*
import akka.persistence.typed.scaladsl.*
import java.time.Instant

object UserAggregate:
  sealed trait Command
  case class CreateUser(name: String, email: String, replyTo: ActorRef[Response]) extends Command
  case class UpdateEmail(email: String, replyTo: ActorRef[Response]) extends Command
  case class Deactivate(replyTo: ActorRef[Response]) extends Command
  case class GetState(replyTo: ActorRef[Option[User]]) extends Command

  sealed trait Event
  case class UserCreated(id: String, name: String, email: String, at: Instant) extends Event
  case class EmailUpdated(email: String, at: Instant) extends Event
  case class UserDeactivated(at: Instant) extends Event

  sealed trait Response
  case class Success(user: User) extends Response
  case class Failure(reason: String) extends Response

  case class User(
    id: String,
    name: String,
    email: String,
    status: UserStatus,
    createdAt: Instant,
    updatedAt: Instant
  )

  enum UserStatus:
    case Active, Deactivated

  def apply(id: String): Behavior[Command] =
    EventSourcedBehavior[Command, Event, Option[User]](
      persistenceId = PersistenceId("User", id),
      emptyState = None,
      commandHandler = commandHandler(id),
      eventHandler = eventHandler
    ).withRetention(RetentionCriteria.snapshotEvery(100, 2))

  private def commandHandler(id: String)(state: Option[User], cmd: Command): Effect[Event, Option[User]] =
    state match
      case None => handleNew(id, cmd)
      case Some(user) if user.status == UserStatus.Active => handleActive(user, cmd)
      case Some(_) => handleDeactivated(cmd)

  private def handleNew(id: String, cmd: Command): Effect[Event, Option[User]] =
    cmd match
      case CreateUser(name, email, replyTo) =>
        val event = UserCreated(id, name, email, Instant.now)
        Effect
          .persist(event)
          .thenRun(state => replyTo ! Success(state.get))
      case GetState(replyTo) =>
        replyTo ! None
        Effect.none
      case UpdateEmail(_, replyTo) =>
        replyTo ! Failure("User does not exist")
        Effect.none
      case Deactivate(replyTo) =>
        replyTo ! Failure("User does not exist")
        Effect.none

  private def handleActive(user: User, cmd: Command): Effect[Event, Option[User]] =
    cmd match
      case UpdateEmail(email, replyTo) =>
        Effect
          .persist(EmailUpdated(email, Instant.now))
          .thenRun(state => replyTo ! Success(state.get))
      case Deactivate(replyTo) =>
        Effect
          .persist(UserDeactivated(Instant.now))
          .thenRun(state => replyTo ! Success(state.get))
      case GetState(replyTo) =>
        replyTo ! Some(user)
        Effect.none
      case CreateUser(_, _, replyTo) =>
        replyTo ! Failure("User already exists")
        Effect.none

  private def handleDeactivated(cmd: Command): Effect[Event, Option[User]] =
    cmd match
      case GetState(replyTo) =>
        Effect.none.thenRun(state => replyTo ! state)
      case CreateUser(_, _, replyTo) =>
        replyTo ! Failure("User is deactivated")
        Effect.none
      case UpdateEmail(_, replyTo) =>
        replyTo ! Failure("User is deactivated")
        Effect.none
      case Deactivate(replyTo) =>
        replyTo ! Failure("User is deactivated")
        Effect.none

  private val eventHandler: (Option[User], Event) => Option[User] = (state, event) =>
    event match
      case UserCreated(id, name, email, at) =>
        Some(User(id, name, email, UserStatus.Active, at, at))
      case EmailUpdated(email, at) =>
        state.map(_.copy(email = email, updatedAt = at))
      case UserDeactivated(at) =>
        state.map(_.copy(status = UserStatus.Deactivated, updatedAt = at))
```
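
Callers interact with the aggregate through the typed ask pattern. A minimal sketch — it spawns the behavior directly as the system guardian for brevity, whereas production deployments would usually place aggregates behind Cluster Sharding:

```scala
import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.AskPattern.*
import akka.util.Timeout
import scala.concurrent.Future
import scala.concurrent.duration.*

// The aggregate for entity "user-1" runs as the guardian behavior.
given system: ActorSystem[UserAggregate.Command] =
  ActorSystem(UserAggregate("user-1"), "user-system")
given Timeout = Timeout(3.seconds)

// Ask the aggregate to create the user and await its typed reply.
val reply: Future[UserAggregate.Response] =
  system.ask(replyTo =>
    UserAggregate.CreateUser("Alice", "alice@example.com", replyTo))
```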

---

## ZIO Application

### Complete ZIO Service

```scala
// Main.scala
import zio.*
import zio.http.*
import zio.json.*

object Main extends ZIOAppDefault:
  def run =
    Server.serve(UserApp.routes)
      .provide(
        Server.default,
        UserServiceLive.layer,
        UserRepositoryLive.layer,
        Database.layer
      )

// UserApp.scala
object UserApp:
  val routes: Routes[UserService, Response] = Routes(
    Method.GET / "users" -> handler { (req: Request) =>
      for
        service <- ZIO.service[UserService]
        users <- service.findAll(0, 20).orDie
      yield Response.json(users.toJson)
    },

    Method.GET / "users" / long("id") -> handler { (id: Long, req: Request) =>
      for
        service <- ZIO.service[UserService]
        user <- service.findById(id).orDie
        response = user match
          case Some(u) => Response.json(u.toJson)
          case None    => Response.status(Status.NotFound)
      yield response
    },

    Method.POST / "users" -> handler { (req: Request) =>
      for
        body <- req.body.asString.orDie
        request <- ZIO.fromEither(body.fromJson[CreateUserRequest])
          .mapError(e => Response.text(e).status(Status.BadRequest))
        service <- ZIO.service[UserService]
        user <- service.create(request)
          .mapError(e => Response.text(e.getMessage).status(Status.Conflict))
      yield Response.json(user.toJson).status(Status.Created)
    }
  )

// UserService.scala
trait UserService:
  def findAll(page: Int, size: Int): Task[List[User]]
  def findById(id: Long): Task[Option[User]]
  def create(request: CreateUserRequest): Task[User]

case class UserServiceLive(repository: UserRepository) extends UserService:
  def findAll(page: Int, size: Int): Task[List[User]] =
    repository.findAll(page * size, size)

  def findById(id: Long): Task[Option[User]] =
    repository.findById(id)

  def create(request: CreateUserRequest): Task[User] =
    for
      exists <- repository.existsByEmail(request.email)
      _ <- ZIO.fail(new Exception("Email exists")).when(exists)
      user = User(0, request.name, request.email, UserStatus.Pending)
      saved <- repository.save(user)
    yield saved

object UserServiceLive:
  val layer: ZLayer[UserRepository, Nothing, UserService] =
    ZLayer.fromFunction(UserServiceLive.apply)

// Models.scala
import zio.json.*

case class User(
  id: Long,
  name: String,
  email: String,
  status: UserStatus
) derives JsonEncoder, JsonDecoder

enum UserStatus derives JsonEncoder, JsonDecoder:
  case Pending, Active, Suspended

case class CreateUserRequest(
  name: String,
  email: String
) derives JsonDecoder
```
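
`UserRepositoryLive` is referenced above but not shown. For tests, a `Ref`-backed in-memory stand-in is a common substitute; the sketch below assumes a `UserRepository` trait mirroring the operations `UserServiceLive` actually calls:

```scala
import zio.*

// Assumed trait shape, matching what UserServiceLive calls.
trait UserRepository:
  def findAll(offset: Int, limit: Int): Task[List[User]]
  def findById(id: Long): Task[Option[User]]
  def existsByEmail(email: String): Task[Boolean]
  def save(user: User): Task[User]

case class InMemoryUserRepository(store: Ref[Map[Long, User]], nextId: Ref[Long])
    extends UserRepository:
  def findAll(offset: Int, limit: Int): Task[List[User]] =
    store.get.map(_.values.toList.sortBy(_.id).slice(offset, offset + limit))
  def findById(id: Long): Task[Option[User]] =
    store.get.map(_.get(id))
  def existsByEmail(email: String): Task[Boolean] =
    store.get.map(_.values.exists(_.email == email))
  def save(user: User): Task[User] =
    for
      id    <- nextId.updateAndGet(_ + 1)   // assign the next synthetic id
      saved  = user.copy(id = id)
      _     <- store.update(_ + (id -> saved))
    yield saved

object InMemoryUserRepository:
  val layer: ULayer[UserRepository] =
    ZLayer.fromZIO(
      for
        store <- Ref.make(Map.empty[Long, User])
        ids   <- Ref.make(0L)
      yield InMemoryUserRepository(store, ids)
    )
```

In a test, provide `InMemoryUserRepository.layer` in place of `UserRepositoryLive.layer` when constructing `UserServiceLive.layer`.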

---

## Build Configuration

### Multi-Project SBT

```scala
// build.sbt
ThisBuild / scalaVersion := "3.4.2"
ThisBuild / organization := "com.example"
ThisBuild / version := "1.0.0"

lazy val commonSettings = Seq(
  scalacOptions ++= Seq(
    "-deprecation",
    "-feature",
    "-unchecked",
    "-Xfatal-warnings"
  )
)

lazy val root = (project in file("."))
  .aggregate(core, api, analytics)
  .settings(
    name := "scala-microservices"
  )

lazy val core = (project in file("core"))
  .settings(commonSettings)
  .settings(
    name := "core",
    libraryDependencies ++= Seq(
      "org.typelevel" %% "cats-effect" % "3.5.4",
      "io.circe" %% "circe-generic" % "0.15.0",
      "org.scalatest" %% "scalatest" % "3.2.18" % Test
    )
  )

lazy val api = (project in file("api"))
  .dependsOn(core)
  .settings(commonSettings)
  .settings(
    name := "api",
    libraryDependencies ++= Seq(
      "org.http4s" %% "http4s-ember-server" % "0.24.0",
      "org.http4s" %% "http4s-circe" % "0.24.0",
      "org.http4s" %% "http4s-dsl" % "0.24.0",
      "org.tpolecat" %% "doobie-core" % "1.0.0-RC4",
      "org.tpolecat" %% "doobie-postgres" % "1.0.0-RC4"
    )
  )

lazy val analytics = (project in file("analytics"))
  .dependsOn(core)
  .settings(commonSettings)
  .settings(
    name := "analytics",
    libraryDependencies ++= Seq(
      // Spark and Delta publish only Scala 2.12/2.13 artifacts; depend on the
      // 2.13 build from Scala 3 via CrossVersion.for3Use2_13.
      ("org.apache.spark" %% "spark-sql" % "3.5.0" % Provided)
        .cross(CrossVersion.for3Use2_13),
      ("io.delta" %% "delta-spark" % "3.0.0")
        .cross(CrossVersion.for3Use2_13)
    )
  )
```

---

Last Updated: 2025-12-07
Version: 1.0.0
