  * Find and do certificates
</wrap>|

====== CTO certification ======

 - https://thectoclub.com/news/best-cto-courses/
  
====== Write and publish ======
      - enables loose coupling of modules
  
====== Local AI ======
 * LocalAI:
   * Description: LocalAI is an open-source, free, self-hosted alternative to OpenAI that provides a drop-in replacement REST API for local inference. It supports a wide range of model formats, including GGUF, and can pull models from various sources, including OCI registries (such as Ollama's OCI registry) and Hugging Face.
   * How to get it: You typically run LocalAI itself in a Docker container:
     docker run -ti --name local-ai -p 8080:8080 localai/localai:latest

     (Use a GPU-specific image such as localai/localai:latest-gpu-nvidia-cuda-12 if you have an NVIDIA GPU.)
   * Usage: Once LocalAI is running, you can interact with it via its OpenAI-compatible API, e.g. using curl or Python's openai library with base_url pointed at LocalAI (see the request sketch after this list). You can also tell LocalAI to pull models directly from an OCI URI, for example:
     local-ai run ollama://gemma:2b

   * Pros: Very flexible, supports many model types and backends, provides an OpenAI-compatible API, robust community.
   * Cons: Requires running a separate LocalAI container; might be overkill if you just want simple CLI interaction.
 * Ollama:
   * Description: Ollama is another popular, free, and open-source tool designed specifically for running LLMs locally. It is very user-friendly and handles much of the complexity (GGUF conversion, quantization, and OCI packaging) behind the scenes. While it uses its own "Ollama registry", it effectively packages GGUF models in an OCI-like fashion.
   * How to get it: Ollama provides a simple installation script for Linux:
     curl -fsSL https://ollama.com/install.sh | sh

   * Usage:
     * To pull and run a model: ollama run gemma:2b
     * Ollama also exposes an API at http://localhost:11434, which is becoming a widely adopted standard for local LLM APIs (see the sketch after this list).
   * Pros: Extremely easy to use, handles all GGUF and OCI packaging complexities, growing model library, good community support, both CLI and API.
   * Cons: While it uses OCI principles for distribution, it works with its own "Ollama" model names rather than generic OCI registry paths.
 * RamaLama:
   * Description: RamaLama is an open-source tool that aims to simplify local serving of AI models from various sources, including OCI container registries, using familiar container concepts (Docker/Podman). It detects your system's GPU support automatically and pulls appropriate OCI images.
   * How to get it: It is a command-line tool (typically installed from source or via pre-built packages) and relies on a container engine (Docker or Podman) being present (see the usage sketch after this list).
   * Pros: Container-native workflow, handles GPU detection, supports multiple AI model registries.
   * Cons: Newer project; may not yet have as extensive a feature set or community as llama.cpp derivatives or Ollama.
Key considerations when choosing:
 * Ease of Use: If you want the simplest experience, Ollama is hard to beat.
 * Flexibility & Control: LocalAI offers a high degree of flexibility through its API and support for various backends.
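
As a concrete example, here is a minimal request sketch against LocalAI's OpenAI-compatible endpoint. It assumes the container started above is listening on port 8080 and that the pulled model is registered under the name gemma:2b (the exact name depends on how the model was installed, so adjust it to whatever GET /v1/models reports):

     # list the models LocalAI currently knows about
     curl http://localhost:8080/v1/models

     # OpenAI-style chat completion against the local model (model name is an assumption)
     curl http://localhost:8080/v1/chat/completions \
       -H "Content-Type: application/json" \
       -d '{"model": "gemma:2b", "messages": [{"role": "user", "content": "Say hello in one sentence."}]}'

The same request should work from Python's openai client by constructing it with base_url="http://localhost:8080/v1" and a placeholder API key.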
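The equivalent sketch for Ollama's local API, assuming gemma:2b has already been pulled with ollama run gemma:2b or ollama pull gemma:2b:

     # single-shot generation via Ollama's native REST API
     curl http://localhost:11434/api/generate \
       -d '{"model": "gemma:2b", "prompt": "Say hello in one sentence.", "stream": false}'

     # chat-style request via the same API
     curl http://localhost:11434/api/chat \
       -d '{"model": "gemma:2b", "messages": [{"role": "user", "content": "Say hello in one sentence."}], "stream": false}'

Recent Ollama versions also expose an OpenAI-compatible endpoint under /v1, so the LocalAI-style request above can be pointed at http://localhost:11434/v1/chat/completions as well.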
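For completeness, a RamaLama usage sketch. This is only an assumption based on the project's documented pull/run/serve subcommands and its ollama:// transport prefix; check the RamaLama README for the exact installation method and current syntax.

     # pull a model through RamaLama's Ollama transport (assumed syntax)
     ramalama pull ollama://gemma:2b
     # chat with the model interactively in the terminal
     ramalama run ollama://gemma:2b
     # or expose it over a local REST endpoint instead
     ramalama serve ollama://gemma:2b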

====== Spring Boot MCP ======

https://www.infoq.com/articles/spring-ai-1-0/

====== AWS AI/ML Certifications ======
 * AWS Certified Machine Learning – Specialty: https://aws.amazon.com/certification/certified-machine-learning-specialty/
 * AWS Certified AI Practitioner: https://aws.amazon.com/certification/certified-ai-practitioner/
 * AWS Certified Data Analytics – Specialty: https://aws.amazon.com/certification/certified-data-analytics-specialty/

====== Microsoft Azure certifications ======

 * Generative AI for Beginners (Microsoft Learn): https://learn.microsoft.com/en-us/shows/generative-ai-for-beginners/
 * Microsoft Certified: Azure AI Engineer Associate
   * Link: https://learn.microsoft.com/en-us/credentials/certifications/azure-ai-engineer/
 * Microsoft Certified: Azure Data Scientist Associate
   * Link: https://learn.microsoft.com/en-us/credentials/certifications/azure-data-scientist/
 * Microsoft Certified: Azure AI Fundamentals
   * Link: https://learn.microsoft.com/en-us/credentials/certifications/azure-ai-fundamentals/
 * Microsoft Certified: Azure Data Engineer Associate
   * Link: https://learn.microsoft.com/en-us/credentials/certifications/azure-data-engineer/
 * Microsoft Certified: Data Analyst Associate
   * Link: https://learn.microsoft.com/en-us/credentials/certifications/data-analyst-associate/

====== Spring AI Content ======
 * Spring AI Project Page: https://spring.io/projects/spring-ai/
 * AmigosCode Spring AI Course: https://amigoscode.com/courses/spring-boot/spring-ai
 * Developing Generative AI Applications with Spring (Ascendient Learning): https://www.ascendientlearning.com/it-training/generative-ai/genai-with-spring-70641-detail.html
 * Generative AI for Java and Spring Development (Coursera): https://www.coursera.org/learn/generative-ai-for-java-and-spring-development
 * Class Central - Spring AI Courses (collection of Udemy, YouTube, etc.): https://www.classcentral.com/subject/spring-ai
  
  