Best of Machine Learning: December 2024

  1.
    Article
    PyImageSearch · 1y

    PNG Image to STL Converter in Python

    Learn how to convert a PNG image to an STL file using TripoSR in Python. This guide walks through setting up the environment, importing necessary libraries, processing the image to create a 3D model, and converting the model from OBJ to STL format. Ideal for designers, engineers, or hobbyists aiming to create 3D printable objects from 2D images.
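The final OBJ-to-STL conversion step described above can be illustrated without any mesh library (the guide itself presumably uses one); a minimal, dependency-free sketch that handles triangulated OBJ input:

```python
import math

def obj_to_stl(obj_text, name="model"):
    """Convert a minimal Wavefront OBJ (v/f lines, triangular faces) to ASCII STL."""
    verts, faces = [], []
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            verts.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "f":
            # OBJ faces are 1-indexed and may use "v/vt/vn" syntax
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    out = [f"solid {name}"]
    for a, b, c in faces:
        (ax, ay, az), (bx, by, bz), (cx, cy, cz) = verts[a], verts[b], verts[c]
        # Facet normal = normalized cross product of two edge vectors
        ux, uy, uz = bx - ax, by - ay, bz - az
        vx, vy, vz = cx - ax, cy - ay, cz - az
        nx, ny, nz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
        norm = math.sqrt(nx * nx + ny * ny + nz * nz) or 1.0
        out.append(f"  facet normal {nx/norm:.6e} {ny/norm:.6e} {nz/norm:.6e}")
        out.append("    outer loop")
        for x, y, z in (verts[a], verts[b], verts[c]):
            out.append(f"      vertex {x:.6e} {y:.6e} {z:.6e}")
        out.append("    endloop")
        out.append("  endfacet")
    out.append(f"endsolid {name}")
    return "\n".join(out)
```

Real meshes from TripoSR are far larger, but the file-format translation is exactly this shape.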

  2.
    Article
    Hacker News · 1y

    How I run LLMs locally

    Running LLMs locally can be achieved with various open-source tools on a powerful machine, here a Core i9 CPU, an RTX 4090 GPU, and 96 GB of RAM. LLM performance varies with model size and hardware specifications. Tools like Ollama, Open WebUI, and llamafile are used for running models, while AUTOMATIC1111 and Fooocus are preferred for image generation. Code completion is enhanced with Continue in VSCode, and Smart Connections in Obsidian assists with managing model updates. Keeping up with LLM advancements is crucial due to their rapid development.
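As one illustration of the local workflow, a prompt can be sent to a running Ollama server over its HTTP API (it listens on port 11434 by default). The request payload is built separately from the network call; the model name here is just an example:

```python
import json
import urllib.request

def build_generate_request(prompt, model="llama3.1"):
    """Payload for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ollama_generate(prompt, model="llama3.1", host="http://localhost:11434"):
    """Send the prompt to a locally running Ollama server and return the text."""
    body = json.dumps(build_generate_request(prompt, model)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # requires `ollama serve` running
        return json.loads(resp.read())["response"]
```

Nothing leaves the machine: the only network hop is to localhost.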

  3.
    Article
    Machine Learning Mastery · 1y

    7 Machine Learning Projects For Beginners

    Explore seven beginner-friendly machine learning projects to gain real-world experience and enhance your career prospects. Projects include Titanic Survival Prediction, Stock Price Prediction, Email Spam Classifier, Handwritten Digit Recognition, Movie Recommendation System, Customer Churn Prediction, and Face Detection. These projects will teach you important ML skills such as data preparation, classification, regression, computer vision, and natural language processing.

  4.
    Article
    Daily Dose of Data Science | Avi Chawla | Substack · 1y

    Building a 100% Local mini-ChatGPT

    A guide on building a local mini-ChatGPT app using the Llama3.2-vision model and Chainlit. The post includes a demo, necessary tools, and step-by-step coding instructions with multimodal prompting. The code and further resources for AI engineering are provided on GitHub.

  5.
    Video
    Fireship · 1y

    Devin just came to take your software job… will code for $8/hr

    Devin is a fully automated junior engineer that can build, test, and ship code for $500 per month, roughly $8 per hour. Despite concerns about its capabilities and a rocky initial launch, Devin is positioned to replace human programmers in enterprise settings. Users interact with Devin through Slack, where it can perform tasks and integrate with tools like GitHub. While impressive, Devin still exhibits typical AI shortcomings, such as making unnecessary changes or giving confusing explanations. The post also highlights new AI tools from OpenAI and Google, along with the introduction of PGVectorizer from Timescale for simplifying AI system development.

  6.
    Article
    DEV · 1y

    Cracking AWS Certifications: A Guide From Beginner to Pro

    2024 was a year of successfully completed AWS certifications. Key insights include the importance of understanding personal learning styles, hands-on experience, and practical application of knowledge. Each certification, from AWS Certified Cloud Practitioner to AWS Certified Developer - Associate, offers unique challenges and areas of focus, from general cloud knowledge to specific roles like security, DevOps, and machine learning. Planning the sequence of exams and understanding fundamental AWS concepts are critical for success. The journey emphasizes growth through continuous learning and effective preparation.

  7.
    Article
    Community Picks · 1y

    Welcome to Langflow

    Langflow is an open-source, Python-powered framework designed for building multi-agent and Retrieval Augmented Generation (RAG) applications. It features an intuitive visual flow builder that allows developers to create complex AI workflows with ease. Suitable for both seasoned AI developers and beginners, Langflow supports a wide range of applications like intelligent chatbots, document analysis systems, and content generation. Join the community to share projects and seek support.

  8.
    Article
    DEV · 1y

    Llama 3.3 vs OpenAI O1

    Llama 3.3 and OpenAI O1 are two advanced AI models offering enhanced reasoning, scalability, and versatile applications. Llama 3.3 stands out for its open-source flexibility and cost-effectiveness, while OpenAI O1 offers a user-friendly API and robust security. Apidog is recommended for integrating these AI models, simplifying API development with its intuitive interface.

  9.
    Article
    Machine Learning Mastery · 1y

    The Ultimate Guide to Building a Machine Learning Portfolio That Lands Jobs

    Building a compelling machine learning portfolio is crucial for standing out in the competitive job market. Create a diverse portfolio showcasing various projects with different machine learning techniques, including handling both structured and unstructured data. Document each project thoroughly, highlighting your problem-solving ability, data preprocessing steps, feature engineering, model selection, and evaluation metrics. Choose appropriate platforms such as GitHub, Streamlit, or HuggingFace Spaces to host your portfolio, and enhance your profile by writing detailed blog posts about your findings.

  10.
    Article
    Towards AI · 1y

    Build Your LLM Engineer Portfolio: A 3-Month Roadmap

    A step-by-step guide to designing, refining, and showcasing a portfolio tailored for aspiring LLM engineers. Highlights the importance of practical, hands-on projects to stand out in the competitive AI job market and offers insights from the author's experience in crafting sophisticated GenAI applications and designing comprehensive product solutions.

  11.
    Article
    Daily Dose of Data Science | Avi Chawla | Substack · 1y

    RAG vs Agentic RAG

    Agentic RAG systems introduce dynamic, adaptable behaviors into the traditional RAG workflow. Unlike traditional RAG, which retrieves and generates once, agentic RAGs iteratively refine queries and context, adapting based on the problem's complexity. This makes them more effective for complex queries and problem-solving. The open-source tool Opik by CometML supports the evaluation, testing, and monitoring of LLM applications from development to production, offering features like logging traces and detecting hallucinations.
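The retrieve-answer-critique-retry behavior that distinguishes agentic RAG from one-shot RAG can be sketched as a plain loop over three pluggable callables; the retriever, generator, and critic below are placeholders, not any specific library's API:

```python
def agentic_rag(question, retrieve, generate, critique, max_rounds=3):
    """Iteratively retrieve, answer, and self-critique until the answer passes.

    retrieve(query)            -> list of context strings
    generate(question, ctx)    -> draft answer
    critique(question, answer) -> (ok: bool, refined_query: str)
    """
    query = question
    answer = None
    for _ in range(max_rounds):
        context = retrieve(query)
        answer = generate(question, context)
        ok, query = critique(question, answer)
        if ok:
            break  # the critic accepted the answer; stop refining
    return answer
```

Traditional RAG is the special case `max_rounds=1` with a critic that always accepts.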

  12.
    Article
    Daily Dose of Data Science | Avi Chawla | Substack · 1y

    A crash course on RAG systems—Part 5

    Part 5 of the RAG crash course focuses on the implementation of key components for multimodal RAG systems, such as CLIP embeddings, multimodal prompting, and tool calling. The series aims to educate readers on building reliable RAG systems that can reduce costs and handle complex data types, ultimately aiding businesses in achieving greater impact.
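Once images and text are embedded into CLIP's shared vector space, multimodal retrieval reduces to nearest-neighbor search. A dependency-free sketch, assuming the embeddings have already been produced by some encoder (the toy 2-D vectors below stand in for real CLIP vectors):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def retrieve(query_vec, index, k=3):
    """index: list of (item, embedding) pairs living in one shared space,
    so text queries can match image items and vice versa."""
    ranked = sorted(index, key=lambda pair: cosine(query_vec, pair[1]), reverse=True)
    return [item for item, _ in ranked[:k]]
```

The point of the shared space is that the index can mix modalities freely.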

  13.
    Article
    SwirlAI · 1y

    Building AI Agents from scratch - Part 1: Tool use

    Learn how to build AI agents from scratch, focusing on implementing tool usage capabilities without any orchestration frameworks. The guide covers creating Python functions as tools, constructing effective system prompts, and developing an Agent class to plan and execute actions using provided tools. The tutorial includes detailed code examples and explanations for wrapping functions as tools, formatting prompts, and executing tasks effectively.
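The core pattern — wrapping Python functions as tools, listing them in the system prompt, and dispatching the model's tool call — can be sketched as follows; the JSON reply format is an assumption for illustration, not the article's exact protocol:

```python
import json

class Tool:
    """Wraps a plain Python function so the agent can describe and call it."""
    def __init__(self, fn, description):
        self.name = fn.__name__
        self.fn = fn
        self.description = description

class Agent:
    """Minimal tool-using agent: the LLM (left abstract here) is expected to
    reply with JSON like {"tool": name, "args": {...}}."""
    def __init__(self, tools):
        self.tools = {t.name: t for t in tools}

    def system_prompt(self):
        lines = ["You can call these tools by replying with JSON:"]
        for t in self.tools.values():
            lines.append(f"- {t.name}: {t.description}")
        return "\n".join(lines)

    def execute(self, model_reply):
        call = json.loads(model_reply)
        tool = self.tools[call["tool"]]
        return tool.fn(**call.get("args", {}))
```

Orchestration frameworks automate exactly this description-and-dispatch plumbing.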

  14.
    Article
    Towards AI · 1y

    10 No-Nonsense Machine Learning Tips for Beginners (Using Real-World Datasets)

    Get practical with machine learning by starting with simple models like Linear Regression and Decision Trees using real-world datasets from the UCI Machine Learning Repository. Focus on hands-on experimentation to build a strong foundation before diving into more complex models like neural networks.

  15.
    Article
    Community Picks · 1y

    The 6 Best LLM Tools To Run Models Locally

    Running large language models (LLMs) locally offers enhanced data privacy, customization, and cost savings. This post covers the top six tools (LM Studio, Jan, Llamafile, GPT4All, Ollama, and llama.cpp) that allow developers to run LLMs offline on Mac, Windows, and Linux platforms, providing features like advanced model customization, offline support, and integration with popular AI providers. Local LLM tools are useful in various scenarios, including telehealth and no-internet locations, ensuring data does not leave the local device.

  16.
    Video
    YouTube · 1y

    Data Science Full Course for Beginners | IBM

    Data science is a rapidly growing field with significant career opportunities due to the massive amounts of data produced and advancements in computing power and artificial intelligence. The course from IBM introduces key concepts and skills necessary for starting a career in data science, including big data, artificial intelligence, and cloud computing. It provides instructional videos, readings, practice assessments, and insights from data science professionals, concluding with a case study and a final peer-reviewed project.

  17.
    Article
    Hugging Face · 1y

    Visualize and understand GPU memory in PyTorch

    This tutorial explains how to visualize and understand GPU memory usage in PyTorch during model training. It provides step-by-step instructions on generating and interpreting memory profiles using PyTorch's built-in tools. The tutorial also covers how to estimate and optimize memory requirements for training large models, offering practical tips to manage GPU memory efficiently.

  18.
    Article
    Community Picks · 1y

    Rust learning resources

    Rust is not just for system programming; it is versatile and can be used for a variety of projects. Avoid paid courses and utilize free resources like books and YouTube channels to learn Rust. Start with foundational books like 'The Rust Book,' then explore practical projects through resources such as the 'Practical Rust Projects' and 'Creative Projects for Rust Programmers.' Specialized areas like web development, command line utilities, and machine learning are also covered in specific books. Use repositories like 'Awesome Embedded Rust' and 'Idiomatic Rust' for further learning and updates.

  19.
    Article
    Towards Data Science · 1y

    How to Build a Graph RAG App

    Learn how to build a Graph RAG (Retrieval-Augmented Generation) app that uses knowledge graphs and large language models to retrieve, filter, and summarize medical journal articles. The app incorporates vector databases for initial searches and structured knowledge graph metadata for filtering and organization, leveraging the MeSH controlled vocabulary to ensure relevant results. This approach enhances accuracy, explainability, and domain-specific knowledge retrieval, applicable to various fields beyond medicine.
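The two-stage retrieval the app uses — vector search first, then knowledge-graph metadata filtering — can be sketched with stub embeddings and MeSH-style tags; every name below is illustrative, not from the article's code:

```python
def dot(u, v):
    """Similarity score for unit-length stub embeddings."""
    return sum(a * b for a, b in zip(u, v))

def graph_rag_search(query_vec, docs, required_tags, k=5):
    """docs: (doc_id, embedding, tags) triples. Rank by vector similarity,
    then keep only documents whose graph metadata covers the required tags."""
    ranked = sorted(docs, key=lambda d: dot(query_vec, d[1]), reverse=True)
    hits = [d for d in ranked if required_tags <= d[2]]
    return [doc_id for doc_id, _, _ in hits[:k]]
```

Using a controlled vocabulary like MeSH for the tags is what makes the filter precise rather than fuzzy.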

  20.
    Article
    Baeldung · 1y

    Implementing an AI Assistant with Spring AI

    This tutorial delves into the features of Spring AI to create an AI assistant using LLMs like ChatGPT. It highlights the key functionalities, including context-aware response generation, structured output conversion, and integrating with Vector DBs. The process involves setting up necessary dependencies, creating relevant tables, and implementing callback functions. Common concerns like data privacy and maintaining conversational states are addressed using Advisors APIs. Examples demonstrate how to build a chatbot in a legacy Order Management System, showcasing practical applications of these concepts.

  21.
    Article
    Daily Dose of Data Science | Avi Chawla | Substack · 1y

    A crash course on RAG systems—Part 6

    Part 6 of the crash course on RAG systems explores how to build a more extensive and capable multimodal RAG system using CLIP embeddings, multimodal prompting, and tool calling. The post includes a unique dataset combining social media posts with images to provide a practical learning experience. The series covers everything from foundational components and evaluation to optimization and handling complex documents, aiming to help users implement reliable RAG systems and solve key NLP challenges with LLMs.

  22.
    Article
    Machine Learning News · 1y

    Meet MegaParse: An Open-Source AI Tool for Parsing Various Types of Documents for LLM Ingestion

    MegaParse is an open-source tool designed to efficiently parse various types of documents (PDF, Word, Excel, CSV, etc.) for ingestion into large language models (LLMs). It saves users significant time and effort by automating the conversion process while retaining information integrity. The tool is highly versatile, handling different document elements such as tables and images, and supports customizable output formats. Installation is straightforward via pip, with additional setups for dependencies like Poppler, Tesseract, and libmagic. MegaParse also provides advanced usage options and benchmarking capabilities, making it a reliable choice for developers and enterprises looking to streamline their AI data pipeline.

  23.
    Article
    Machine Learning Mastery · 1y

    5 Tools for Visualizing Machine Learning Models

    Machine learning models require specialized tools to visualize their structure, performance, and behavior. Five useful tools for this purpose include TensorBoard for neural network models, SHAP for model prediction explanations, Yellowbrick for Python-based model diagnostics, Netron for deep learning model architecture visualization, and LIME for intuitive model explanations. These tools cater to various model types and use cases, helping users understand complex ML models better.

  24.
    Video
    Tech With Tim · 1y

    ADVANCED Python AI Agent Tutorial - Using RAG, Langflow & Multi-Agents

    Learn how to build a multi-AI agent application using Langflow to handle complex tasks like customer support. This tutorial demonstrates step-by-step integration, including implementing retrieval augmented generation (RAG) for responsive data lookup from a database and combining multiple AI agents to solve real-world business cases effectively. The guide shows how to set up a front end with Streamlit and offers ways to extend and scale the system for practical use.

  25.
    Article
    Towards Data Science · 1y

    Multi-Agentic RAG with Hugging Face Code Agents

    The post provides a detailed tutorial on using a small open-source large language model, Qwen2.5-7B-Instruct, to create a local multi-agentic RAG (Retrieval Augmented Generation) system using Hugging Face code agents. It explains the architecture and functionality of multi-agent systems, particularly code agents, and their advantages. Key details include the ReAct framework for LLM agents, the specific roles of manager, Wikipedia search, and page search agents, and security measures for code execution.