Freelance SQL jobs for professional developers! You don't need to waste your precious time searching for work. See salaries, apply easily, and get hired! Landing high-paying SQL freelance jobs can be difficult, but it doesn't have to be. Discover your next career move with Vollna! Freelance SQL jobs online offer a great option for working remotely from home. Create a free account with us now!
Marketing Data Analyst – Direct Mail Pivot Table
Client Rank: Risky
We’re looking for an experienced data analyst to help us make sense of our direct mail campaign data. We need a skilled professional to analyze our data and provide actionable insights.
We need answers to critical questions, including: What conclusions can we draw from our current data? How should we be running tests to optimize performance? How do we determine if a test is successful? Do we have enough data to make statistically sound decisions? How do we account for seasonality impacts in our analysis?
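A quick way to answer "do we have enough data?" for two mailing variants is a two-proportion z-test; here is a minimal sketch in Python, where the mailing sizes and response counts are made-up placeholders:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare response rates of two direct-mail variants.
    Returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled response rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf, so no SciPy dependency is needed.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: variant A got 120/5000 responses, variant B 90/5000.
z, p = two_proportion_z_test(120, 5000, 90, 5000)
print(round(z, 2), round(p, 4))
```

If the p-value stays above the chosen threshold, the honest conclusion is "keep mailing" rather than "variant A won"; seasonality would additionally require comparing windows from the same time of year.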
Skills: Deep Neural Network, Convolutional Neural Network, Python, Tableau, Data Science, TensorFlow, Computer Vision, Keras, Machine Learning, SQL, Microsoft Excel, Big Data, Data Analysis, Artificial Intelligence, Data Visualization
Budget: not specified (posted 9 hours ago)
Volkaria
Client Rank: Risky
Hi, I want to create a Minecraft server with MCP 1.12. I need the MCP setup, and I'd like to create a launcher to launch it:
- Launcher
- Web auth on the launcher (I already have the website)
- An MCP 1.12.2 build with my 4 custom minerals (only 4 for now)
Skills: Linux, Java, phpMyAdmin, Minecraft, JetBrains IntelliJ IDEA, Plugin Development, SQL, Redis, Network Engineering, DNS, Swing, Gradle, Apache Maven, System Administration, Proxmox VE
Fixed budget: 100 USD (posted 9 hours ago)
Tableau to Web Dashboard Conversion
Client Rank: Risky
Only freelancers located in the U.S. may apply.
Project Overview:
We are looking for a skilled freelance developer to convert approximately 20 Tableau dashboards into hard-coded web-based dashboards. These dashboards will be integrated into our existing web platform, which supports custom dashboards via SQL queries and displays data using HTML, JavaScript, CSS, and Highcharts.

Key Responsibilities:
- Recreate Tableau dashboards as static web-based dashboards.
- Use SQL queries to retrieve data and integrate it into the dashboards. (We can support this.)
- Develop dashboards using HTML, JavaScript, CSS, and Highcharts.
- Ensure the code is modular, well-commented, and structured for easy customization (e.g., changing database names and tags for different customers).
- Collaborate with our team to ensure seamless integration into our platform.

Requirements:
- Strong experience in front-end web development (HTML, JavaScript, CSS).
- Proficiency in SQL for querying data sources.
- Familiarity with Highcharts or similar charting libraries.
- Experience with Tableau or translating Tableau dashboards into web-based formats.
- Ability to write clean, well-documented, and easily adaptable code.
- Strong problem-solving skills and ability to work independently.

Nice-to-Have:
- Experience working with data visualization projects.
- Understanding of back-end technologies (optional, but helpful).
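The pipeline this post describes (a SQL query feeding a Highcharts chart) can be sketched roughly as follows in Python; the table, column, and series names are invented for illustration, and on the real platform the front end would consume the emitted JSON:

```python
import json
import sqlite3

# Hypothetical schema standing in for the platform's real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 120.0), ("West", 95.5), ("South", 80.25)])

rows = conn.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY region"
).fetchall()

# Shape the query result the way a Highcharts column chart expects it:
# categories on the x-axis, one series of numeric points.
chart_config = {
    "chart": {"type": "column"},
    "xAxis": {"categories": [r[0] for r in rows]},
    "series": [{"name": "Revenue", "data": [r[1] for r in rows]}],
}
print(json.dumps(chart_config))
```

Keeping the query and the chart-shaping step separate is what makes the "easy customization" requirement tractable: swapping database names or tags only touches the SQL, not the chart code.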
Skills: HTML, CSS, JavaScript, Front-End Development, Back-End Development, Web Development, UX & UI, PostgreSQL, MySQL, Tableau, SQL, Highcharts
Budget: not specified (posted 9 hours ago)
Maintenance Software Developer for Security Updates and Data Backup
Client Rank: Excellent ($30,648 total spent, 14 hires, 22 jobs posted, 64% hire rate, open job, 5.00 of 4 reviews)
We are seeking a reliable Software Developer to perform up to five maintenance hours per month. The primary responsibilities include applying security patches, performing updates, and verifying data backups to ensure system integrity and security. This role requires attention to detail and a proactive approach to maintaining our software environment. If you have experience in software maintenance and a solid understanding of security protocols, we invite you to apply.
Provider will perform maintenance and support services for the Learning Management System (LMS).

Scope of Services: Developer agrees to perform the following maintenance tasks for the LMS.

Software Updates and Patches: the Software Developer will perform a maximum of five (5) maintenance hours per month to:
- Apply security patches and updates to ensure the system remains secure.
- Install patches provided by third-party systems integrated with the LMS.
- Resolve minor bugs and issues affecting system stability.
- Ensure system performance is not degraded after updates.

System Health Checks:
- Monthly reviews of system logs for errors and warnings.
- Performance monitoring to detect and address potential bottlenecks.
- Server uptime and response time checks.

Data Backup Verification:
- Ensure that automated data backups are performed correctly and verify backup integrity.
- Assist in data restoration in case of a system failure.

Incident Response:
- Respond to critical LMS downtime issues within [2] hours during business days.
- Provide troubleshooting assistance and minor bug fixes within the allotted maintenance hours.

Exclusions: the following services are not covered under this Agreement:
- Development of new features or customizations.
- Major LMS upgrades requiring more than the allocated maintenance hours.
- Recovery from issues caused by unauthorized modifications by the Client.
- Training or user support beyond email responses.
- Third-party software licensing or subscription costs.
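The backup-verification duty above often reduces to checksum comparison; a small Python sketch, under the assumption that backups are flat file copies (the paths and file names are placeholders):

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path):
    """Stream a file through SHA-256 so large backups never load fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Simulate a nightly source dump and its backup copy.
workdir = Path(tempfile.mkdtemp())
source = workdir / "lms_dump.sql"
backup = workdir / "lms_dump.sql.bak"
source.write_bytes(b"-- nightly LMS dump\n")
backup.write_bytes(source.read_bytes())

# A backup whose digest matches the source is byte-for-byte intact.
intact = sha256_of(source) == sha256_of(backup)
print("backup verified" if intact else "backup corrupt")
```

A production version would also attempt a trial restore, since a bit-identical copy of a broken dump still verifies.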
Skills: PHP, MySQL, WordPress, Web Development, SQL
Fixed budget: 500 USD (posted 9 hours ago)
Data analyst to integrate multiple data sources into single database
Client Rank: Good ($7,343 total spent, 9 hires, 1 job posted, 100% hire rate, open job)
**Job Opportunity: Part-Time Data Analyst Specialist**
We are looking for an expert part-time Data Collection Specialist to support us in various data scraping projects for the upcoming month. The perfect candidate will possess a strong background in a) combining and aggregating multiple data sources, b) transforming text into structured data, and c) extracting information from various online sources. Your primary responsibilities will include efficiently collecting precise data while maintaining its accuracy and reliability. If you have excellent attention to detail and are skilled in data scraping methodologies, we invite you to apply and join our team!
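The three skills listed (combining sources, turning text into structured data, extraction) boil down to a parse-then-merge loop; a toy Python sketch in which both data sources are entirely made up:

```python
import re

# Hypothetical inputs: one scraped semi-structured text feed and one CRM lookup.
scraped_lines = [
    "Acme Corp | revenue: $1.2M | employees: 40",
    "Globex | revenue: $900K | employees: 25",
]
crm_records = {"Acme Corp": {"country": "US"}, "Globex": {"country": "DE"}}

def parse_line(line):
    """Turn one pipe-delimited text line into a structured record."""
    name, rev, emp = [part.strip() for part in line.split("|")]
    return {
        "name": name,
        "revenue": re.sub(r"revenue:\s*", "", rev),
        "employees": int(re.sub(r"employees:\s*", "", emp)),
    }

# Combine the scraped rows with the CRM data, keyed on company name.
merged = []
for line in scraped_lines:
    row = parse_line(line)
    row.update(crm_records.get(row["name"], {}))
    merged.append(row)
print(merged)
```

The join key (here the company name) is where accuracy is usually lost in real projects, so normalizing it before merging is the part worth the most care.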
Skills: Data Extraction, Accuracy Verification, Data Entry, Database, SQL
Hourly rate: 14 - 30 USD (posted 9 hours ago)
Data Scientist / ML Engineer
Client Rank: Medium
Duration: up to 50 hours with possible prolongation
Responsibilities:
- Own the development of data science tools and products (e.g., from last-mile data engineering to in-dev or production ML models, Tableau dashboards, chatbot agents, etc.), ensuring their reliability, scalability, and business relevance
- Design and implement data science solutions such as predictive models, NLP, and GenAI applications
- Collaborate with cross-functional stakeholders to understand their needs and perspectives and drive technical development toward meeting those needs
- Support and contribute to the planning of data science initiatives and the development of business cases (business case ownership will be elsewhere)

Requirements and skills:
- Full-stack skills with a focus on modeling, with shading towards ML Eng / MLOps
- Experience using traditional ML (predictive models, NLP, light black-box optimization (e.g., OR-Tools, CPLEX))
- Experience with AI and GenAI (chatbot agents)
- Solid experience in Python and SQL
- Expertise with platforms like Dataiku, Snowflake, Tableau, GitLab, Streamlit, Artifactory, DBT, and working from a remote Linux development workstation
- Expertise with cloud services such as AWS
- Expertise in ML frameworks (e.g., TensorFlow, PyTorch/Lightning), MLOps technologies (e.g., Weights & Biases, AWS SageMaker, Ray), and job scheduling frameworks (e.g., Slurm, AWS Step Functions) is preferred
Skills: Machine Learning, Data Science, Artificial Intelligence, Deep Learning, Python, deeplearn.js
Hourly rate: 20 - 40 USD (posted 8 hours ago)
Aid and developer of computer vision model
Client Rank: Risky
Only freelancers located in the U.S. may apply.
I have a start-up and I'm stuck on creating a computer vision model to detect certain objects or features in an image for our company. I have tried to create the model on my own and I have hit a wall. I would like to hear a professional opinion about my dataset (size and complexity) and whether I need a much bigger dataset. On top of that, I have all the masks, COCO annotations, all the files used to create the masks, and training scripts with Detectron2; I have a lot. I also have a 4090 GPU and wouldn't mind training this model locally as well. This job might be pretty easy because I've done the hard work, or it may need scaling and more work before getting a model that is accurate enough to confidently deploy.
We are seeking a highly skilled Computer Vision Engineer to join our innovative startup. Our mission is to revolutionize a process by developing a cutting-edge computer vision model. This project presents a unique opportunity to develop a solution that competes with offerings from billion-dollar companies, positioning our startup at the forefront of the industry. The potential for significant financial success and substantial impact is immense. We invite passionate and talented individuals to join us on this journey to transform the landscape of a certain industry.
Skills: Cloud Computing, Python, SQL, Project Management, Business Strategy, Presentation Design, Requirements Specification, Machine Learning, Cloud Database, Data Analytics, PyTorch, Data Integration, Computer Vision, Data Analysis, Artificial Intelligence
Budget: not specified (posted 8 hours ago)
Data Integration & Reporting Expert (Microsoft Dynamics GP + Tableau)
Client Rank: Excellent ($56,175 total spent, 5 hires, 6 jobs posted, 83% hire rate, open job, 5.00 of 3 reviews)
Featured
Only freelancers located in the U.S. may apply.
Manufacturing company seeking a highly skilled freelancer to help us unlock, analyze, and visualize operational data from Microsoft Dynamics GP (Great Plains). The goal is to mine actionable data, create useful reports, and establish an automated pipeline to Tableau for performance monitoring and analysis.
Key Responsibilities:
- Extract and organize data from Microsoft Dynamics GP (SQL Server-based)
- Build and optimize custom queries and views for reporting purposes
- Set up automated data feeds or ETL pipelines from GP to Tableau
- Create dashboards or data models in Tableau that track performance metrics
- Ensure security, accuracy, and reliability of the data integration process
- Collaborate with our internal accounting and operations teams

Ideal Experience:
- Proven expertise with the Microsoft Dynamics GP database structure
- Strong SQL skills and understanding of data modeling
- Experience with Tableau or other BI tools (ETL automation, dashboards)
- Familiarity with accounting and operational data in manufacturing/distribution environments is a plus
- Experience with tools like Power Automate, SSIS, or custom scripting (Python/VBA) for automation is a bonus

Engagement Details:
- Initial project-based scope, with potential for ongoing support
- Remote work is acceptable, but onsite work in the early stages is required. Central time hours preferred.

We're ready to move quickly and can provide access to our systems and team support.
Skills: Operations Analytics, SQL, Data Visualization, Tableau, Microsoft Power BI, SQL Server Integration Services, Microsoft Excel, Data Analysis, Microsoft Dynamics GP, Finance & Accounting
Hourly rate: 60 - 125 USD (posted 8 hours ago)
Senior Data Engineer Consultant Needed for DBT and Snowflake Projects
Client Rank: Good ($493 total spent, 8 hires, 7 jobs posted, 100% hire rate, open job, 5.00 of 6 reviews)
We are seeking an experienced Senior Data Engineer to provide consulting services on our data pipeline projects. The ideal candidate will have a strong background in using DBT for data transformation, hands-on experience with Snowflake for data warehousing, and proficiency in ETL processes. You will collaborate closely with our data team to optimize data workflows and enhance our data infrastructure. If you are passionate about data engineering and have a proven track record in similar roles, we would love to hear from you!
Skills: Data Engineering, Data Preprocessing, Database Design, Data Integration, ETL Pipeline, Python, Data Science, SQL, Tableau, Data Modeling, Data Analysis, Big Data, R
Fixed budget: 30 USD (posted 8 hours ago)
NextJS Training Required
Client Rank: Medium ($80 total spent, 9 hires, 24 jobs posted, 38% hire rate, open job, 5.00 of 3 reviews)
Hello,
I am looking for an expert in Next.js who can teach me Next.js. I know HTML, CSS, JavaScript, SQL, and some basics of React.js, plus a little Node and Express. I also know OOP programming in Java. So I do have some programming concepts, so it should not be a very long training.
Skills: Next.js
Fixed budget: 20 USD (posted 7 hours ago)
VBA programming short freelance job
Client Rank: Medium (1 job posted, open job)
Hello to all readers,
I need a VBA programmer for a short "question solving stats" job, shown in a dashboard in Excel. The file and dashboard requirements are as follows.

Question results are appended to a spreadsheet called "test results" inside an Excel file. Each line has the following attributes: Test | Question number | Answer | Correct Answer | Question Type | Question SubType | Difficulty | Time | Date | Correct (Yes/No).

What happens in the last column (Correct): the columns "Answer" and "Correct Answer" are compared for each row. If they are equal, the test taker was correct and the value "Yes" is appended to the row under the "Correct" column; if not equal, the value "No" is appended.

What needs to happen in the adjacent spreadsheet by use of a VBA macro: the adjacent spreadsheet has a table called Simulation tests with two columns, Simulation number and simulation date, such as:
Sim1 - date1
Sim2 - date2
Sim3 - date3

All questions solved between Date1 and Date2 are to be analyzed to give performance in percentage, which can be dissected by Question Type, Question SubType, and Difficulty (easy/medium/hard), plus a calculation of average time to solve for each combination. For example: Question Type: Quant; Question SubType: Averages (or a multiple selection of Question SubTypes); Difficulty: Medium. So essentially: of all questions solved between Date1 and Date2 that are (Quant, Averages, Medium), what is the percentage of correct questions, and what was the average time to solve? That is the question the Performance spreadsheet has to answer.

Then a trendline needs to be created in order to pinpoint subjects that improved between two consecutive simulations or got worse in performance. What I mean: between Date1 and Date2, a batch of (Quant, Averages, Medium) questions was solved; between Date2 and Date3, another batch of (Quant, Averages, Medium) questions was solved. Is there a performance improvement between these two batches or not? Form a graph with a trendline (or some other efficient way to exhibit that kind of data) for that combination (Quant, Averages, Medium) to track performance between each two consecutive simulations. That has to be done for each possible combination that exists in the test results spreadsheet.

Another case to consider: say questions for the combination (Quant, Averages, Medium) were solved between Date1 and Date2, were not solved between Date2 and Date3, but were solved between Date3 and Date4. Then the performance for that combination in the Date1-Date2 range is to be compared to the performance in the Date3-Date4 range. Track performance between simulations even if those subjects were not practiced consecutively between simulations; that is the point here.

There are up to 10 simulations in the simulation table. Data has to be appended such that data about performance will not be overwritten, or deleted and then rewritten. If there are only two simulation dates in the simulation table (only two simulations performed so far), then the performance spreadsheet will exhibit the stats about questions solved between those two dates, without comparing them to questions solved between Date2 and Date3, because Simulation 3 has not been taken yet.

The performance spreadsheet needs to create insights about performance at the different difficulty levels and by Question Types and SubTypes. Question Type (3 values) and Question SubType (many values): all their values are to be extracted from the test results spreadsheet, because there are many, so no dictionary was created to list them all. All of this is a lot simpler than it looks on paper; it is just a lot to write down, and it should be very easy for an experienced VBA programmer.

I have a very initial script and an Excel file I can share with you to illustrate the job further (attached). I expect this to be delivered in under 2 hours of work. Does it sound like something you can do?
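The core aggregation this spec asks for (percent correct and average solve time per (Type, SubType, Difficulty) combination within each window between consecutive simulation dates) can be prototyped outside VBA; a Python sketch with invented sample rows:

```python
from datetime import date

# Hypothetical rows mirroring the "test results" sheet columns.
results = [
    {"type": "Quant", "subtype": "Averages", "difficulty": "Medium",
     "time": 90, "date": date(2024, 1, 5), "correct": "Yes"},
    {"type": "Quant", "subtype": "Averages", "difficulty": "Medium",
     "time": 120, "date": date(2024, 1, 6), "correct": "No"},
    {"type": "Quant", "subtype": "Averages", "difficulty": "Medium",
     "time": 60, "date": date(2024, 2, 2), "correct": "Yes"},
]
# Simulation dates from the "Simulation tests" table.
simulations = [date(2024, 1, 1), date(2024, 2, 1), date(2024, 3, 1)]

def performance(rows, start, end, combo):
    """Percent correct and average solve time for one
    (Type, SubType, Difficulty) combination between two simulation dates."""
    batch = [r for r in rows
             if start <= r["date"] < end
             and (r["type"], r["subtype"], r["difficulty"]) == combo]
    if not batch:  # combination not practiced in this window
        return None
    pct = 100 * sum(r["correct"] == "Yes" for r in batch) / len(batch)
    avg = sum(r["time"] for r in batch) / len(batch)
    return pct, avg

combo = ("Quant", "Averages", "Medium")
for start, end in zip(simulations, simulations[1:]):
    print(start, performance(results, start, end, combo))
```

Comparing only the windows where `performance` returns a value naturally handles the "not practiced consecutively" case in the spec: empty windows are skipped and the next non-empty batch becomes the comparison point.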
Skills: Microsoft Power BI Data Visualization, Power Query, Microsoft Excel, Google Sheets, SQL, Microsoft Power BI, Excel Macros, Visual Basic for Applications
Budget: not specified (posted 7 hours ago)
Full Stack Developer (React, Node.JS, SQL, & LLMs) – AI Startup
Client Rank: Excellent ($43,409 total spent, 11 hires, 4 jobs posted, 100% hire rate, open job, 5.00 of 7 reviews)
We’re an AI platform looking for a skilled Full Stack Software Engineer to own the remaining development tasks leading up to launch. We need someone who can write clean, efficient, scalable code that aligns with best practices.
If you thrive in fast-moving environments, love problem-solving, and get excited about working with cutting-edge technologies, this role is for you.

🌟 What You'll Do:
- Take ownership of engineering tasks with hands-on coding.
- Build and optimize website features.
- Identify and implement ways to streamline development and reduce timelines.
- Troubleshoot issues, optimize performance, and ensure a seamless user experience.
- Stay ahead of the curve, exploring and integrating new, cutting-edge tools where they make sense.
- Collaborate directly with the CEO, providing recommendations and taking initiative.

🔍 What We're Looking For:
- Technical expertise in React, Node.js, MySQL.
- Extreme attention to detail: you write high-quality, maintainable code that follows best practices.
- Curiosity & adaptability: excited to experiment, iterate, and embrace new technologies.
- Problem-solver mindset: able to work independently, troubleshoot effectively, and find solutions with minimal direction.
- Strong communicator: comfortable sharing opinions, making recommendations, and collaborating.
- Speed and efficiency: you know how to optimize workflows and cut down development time without sacrificing quality.

💡 Why Us?
- Impact: You'll be instrumental in shaping a disruptive AI-driven platform.
- Collaboration: Work directly with the CEO; your voice and expertise will be valued.
- Innovation: Play with the latest tech and AI to solve real problems.
- Building: Build something that genuinely improves how people experience the world.
- Future Opportunities: Potential for a larger role post-launch if there's a strong fit.

Ready to build something exciting? Let's create something game-changing together.
Skills: SQL, Web Application, AI Development, AI Model Integration, Database Architecture, React, Node.js
Fixed budget: 3,500 USD (posted 7 hours ago)
Senior Laravel Developer
Client Rank: Excellent ($15,078 total spent, 47 hires, 142 jobs posted, 33% hire rate, open job, 4.90 of 30 reviews)
Looking for a Senior Laravel Developer with 6 to 8 years of hands-on experience in Laravel development. Online availability: 9am - 1pm US EST.
In this role, you will design and create projects using the Laravel framework and PHP, and assist the team in delivering high-quality web applications, services, and tools for our business. To ensure success as a Laravel developer you should be adept at utilizing Laravel's GUI and be able to design a PHP application from start to finish. A top-notch Laravel developer will be able to leverage their expertise and experience with the framework to independently produce complete solutions in a short turnaround time.

Requirements:
- A degree in programming, computer science, or a related field.
- Experience with frontend technologies like VueJS, AngularJS, and NodeJS.
- Working knowledge of CSS 3.0.
- Experience working with PHP, performing unit testing, and managing APIs such as REST.
- A solid understanding of application design using Laravel.
- Knowledge of database design and querying using SQL.
- Proficiency in HTML and JavaScript.
- Practical experience using the MVC architecture.
- Problem-solving skills and a critical mindset.
- Great communication skills.
- The desire and ability to learn.

Key Skills: Laravel, NodeJS, VueJS, AngularJS, Livewire, Tailwind, Git
Skills: Web Application, Laravel, CSS, RESTful API, JavaScript, PHP, API
Hourly rate: 15 - 25 USD (posted 7 hours ago)
Remote Drupal Developer (id: 63512)
Client Rank: Medium (11 jobs posted, open job)
Upwork Enterprise Client
title: Remote Drupal Developer (id: 63512)
description: Location: Lemont, IL, USA - 100% Remote. Contract type: Contract. Contract duration: Yearly. Salary: 63 USD per hour.

About the role
Join a leading tech company as a Remote Drupal Developer where you'll be at the forefront of web content management systems innovation. You will engage with cutting-edge technologies and contribute to exciting projects that enhance digital experiences. Enjoy the flexibility of working fully remote while collaborating with a dynamic and experienced team.

Responsibilities
- Conduct investigation and analysis of business requirements to establish objectives for computerized solutions in web content management.
- Prepare detailed software development agile artifacts, including Stories and Testing/Scrum Tasks.
- Work alongside a scrum team to develop services utilizing an Agile Scrum practice and mindset.
- Develop tools, functionality, and components for web content management systems, leveraging community-built functionality as needed.
- Provide expertise in authentication and web services integrations.
- Mentor team members and assist in strategy development and execution.
- Design test plans and deployment procedures for assigned tasks.
- Interface with the code repository and continuous integration/continuous deployment (CI/CD) pipeline.
- Conduct business analysis, create workflow diagrams, and implement workflows.

Requirements
- Must reside within the United States.
- Minimum of 8 years of experience as a Drupal Developer.
- Proficiency in PHP, JavaScript, HTML/CSS.
- Experience with database development using MySQL, SQL Server, or Oracle.

Preferred skills
- Experience working with SharePoint, Enterprise Search (e.g., Coveo), and WordPress.

Disclaimer
Consideration for this opportunity will involve some services from AI career agents provided by Job Protocol Labs LTD. By applying, you agree to the Job Protocol Labs Ltd Terms of Service (https://hunterscouts.com/terms-of-service) and Privacy Policy (https://hunterscouts.com/privacy) and you consent to receiving communications from Job Protocol Labs Ltd via email, SMS, and WhatsApp. Message frequency may vary, and message and data rates may apply.
Skills: Drupal, PHP, JavaScript, CSS, MySQL
Budget: not specified (posted 6 hours ago)
AWS Data Engineer (id: 63521)
Client Rank: Medium (11 jobs posted, open job)
Upwork Enterprise Client
Only freelancers located in the U.S. may apply.
title: AWS Data Engineer (id: 63521)
Location: Phoenix, AZ, USA - Remote. Contract type: Contract. Contract duration: 3+ months, contract to hire.

About the role
Join a thriving tech firm as an AWS Data Engineer where you'll have the opportunity to work with cutting-edge AWS technologies to drive data-driven insights and solutions. This exciting role involves working remotely on diverse data projects, utilizing AWS's powerful suite of tools to create impactful data insights that help steer business decisions.

Responsibilities
- Utilize AWS tools like Amazon Redshift, Amazon Athena, and Amazon QuickSight for data exploration, analysis, and actionable insights.
- Develop and deploy machine learning models using AWS SageMaker to generate predictive insights from large datasets.
- Design, build, and maintain data pipelines using AWS services, ensuring efficient data transformation and movement.
- Implement real-time data processing using AWS Kinesis and Lambda.
- Process large-scale datasets using AWS EMR and related tools, leveraging technologies such as Apache Spark and Hadoop.
- Design and optimize data warehousing solutions using AWS Redshift for high-performance querying and reporting.
- Ensure data quality and governance through validation, cleansing, and checks using AWS services.
- Develop interactive dashboards using Amazon QuickSight to allow stakeholders easy access to interpret data.
- Collaborate with cross-functional teams to translate business requirements into data-driven solutions.
- Monitor and optimize AWS resource utilization for cost-effective data processing.

Requirements
- Must live in the U.S.
- Bachelor's or advanced degree in Computer Science, Data Science, Engineering, or a related field.
- Strong proficiency in AWS services (Redshift, Athena, Glue, SageMaker, Kinesis, Lambda, EMR, and QuickSight).
- Proven experience in data analysis and engineering with a portfolio showcasing impactful projects.
- Proficiency in programming languages like Python, R, or Java; familiarity with Terraform.
- Experience with machine learning frameworks/libraries like TensorFlow or scikit-learn.
- Solid understanding of data warehousing concepts with SQL proficiency.
- Excellent problem-solving skills in a fast-paced, collaborative environment.
- Must be authorized to work in the U.S. without current or future employer sponsorship.

Preferred skills
- Strong communication skills to translate technical insights into actionable business recommendations.

Disclaimer
Consideration for this opportunity will involve some services from AI career agents provided by Job Protocol Labs LTD. By applying, you agree to the Job Protocol Labs Ltd Terms of Service (https://hunterscouts.com/terms-of-service) and Privacy Policy (https://hunterscouts.com/privacy) and you consent to receiving communications from Job Protocol Labs Ltd via email, SMS, and WhatsApp. Message frequency may vary, and message and data rates may apply.
Skills: Amazon Web Services, Amazon Redshift, Machine Learning Model
Budget: not specified (posted 6 hours ago)
Senior Golang Developer (Full-Time, 40 Hours/Week)
Client Rank: Good ($6,740 total spent, 13 hires, 21 jobs posted, 62% hire rate, open job, 4.49 of 9 reviews)
About Us:
We're an ambitious and fast-growing tech company building cutting-edge software solutions. Our current focus is a high-performance CRM platform with complex system architecture and scalability at its core. We're looking for a highly experienced Senior Golang Developer to join our team and help us take our platform to the next level.

Position Summary: As a Senior Golang Developer, you will be responsible for designing, building, and maintaining backend services using Go (Golang). You should have a deep understanding of system architecture, APIs, microservices, and cloud environments. We need someone who can think critically, lead development efforts, and solve complex technical challenges.

Responsibilities:
- Develop and maintain scalable, high-performance backend systems using Golang.
- Collaborate with architects, product managers, and other developers to understand project requirements and system goals.
- Lead the design and implementation of complex services and features.
- Write clean, maintainable, and well-tested code.
- Troubleshoot and debug production issues.
- Participate in code reviews and mentor junior developers.
- Help define and implement best practices and coding standards.
- Contribute to system architecture discussions and strategic decisions.

Requirements:
- 5+ years of experience in software development, with at least 3 years in Golang.
- Strong understanding of system architecture, cloud services (AWS, GCP, or Azure), and distributed systems.
- Experience with RESTful APIs, gRPC, and microservices.
- Familiarity with Docker, Kubernetes, CI/CD pipelines, and Git.
- Solid knowledge of databases (SQL and NoSQL).
- Ability to work independently and deliver high-quality work consistently.
- Excellent communication and problem-solving skills.

Nice to Have:
- Experience with SaaS platforms or CRM systems.
- Background in DevOps or infrastructure.
- Contributions to open-source Golang projects.

Benefits:
- Competitive salary.
- Flexible working hours.
- Opportunity to work on impactful, innovative projects.
- Supportive team and a strong development culture.
- Potential for long-term growth and leadership roles.
Skills: API Development, Full-Stack Development, Golang, API, PostgreSQL, Amazon Web Services, RESTful API, React, JavaScript
Hourly rate: 25 - 35 USD (posted 6 hours ago)
Postgres SQL Expert for Application Process Extraction and Data Warehouse Additions
Client Rank: Excellent ($427,095 total spent, 19 hires, 24 jobs posted, 79% hire rate, open job, 4.87 of 7 reviews)
Featured
## Description:
We're looking for an experienced Postgres SQL specialist to help us extract and restructure key processes from an existing application into our data warehouse. This is a complex project that involves analyzing application logic, understanding embedded Postgres functions, and working with advanced Postgres features.

## What You'll Do:
* Analyze the current application, including SQL queries embedded directly in the codebase (MS .NET codebase). No coding required, but some queries exist in code and some are functions.
* Identify and document processes that rely on embedded Postgres functions
* Create clear and structured documentation of these processes
* Collaborate with our team to redesign these processes for integration into a data warehouse environment
* Document and convert complex SQL queries involving Array and JSON column types
* Assist in restructuring and migrating the relevant data

## Requirements:
* Expert-level proficiency in Postgres SQL
* Strong experience working with Array and JSON column types
* Familiarity with reading and understanding application code that contains embedded SQL
* Experience analyzing and reworking Postgres functions (stored procedures, etc.)
* Strong documentation and communication skills
* Ability to work independently and provide clear progress updates

## Nice to Have:
* Experience with ETL tools or building data pipelines (Airflow, Kestra, Windmill?)
* Familiarity with modern data warehouse platforms (AWS Redshift and Athena)
* Background in data architecture or analytics engineering

## Project Type:
One-time project with the potential for ongoing collaboration

## Work Arrangement:
Remote; must be available for regular check-ins and discussions with our team.

If you thrive on working through deep SQL logic and enjoy translating application-level processes into clean, structured data workflows, we'd love to work with you.
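Converting JSON column types for a warehouse usually means flattening arrays into one row per element (in Postgres itself this is `jsonb_array_elements`); a Python sketch of that same transformation, with an entirely made-up orders/items shape:

```python
import json

# Hypothetical jsonb value from an "orders.items" column; the real schema
# in this project will differ.
raw = '[{"sku": "A1", "qty": 2}, {"sku": "B7", "qty": 1}]'

def flatten_items(order_id, items_json):
    """Expand a JSON array column into one warehouse-style row per element."""
    return [
        {"order_id": order_id, "sku": item["sku"], "qty": item["qty"]}
        for item in json.loads(items_json)
    ]

rows = flatten_items(42, raw)
print(rows)
```

Documenting each embedded query as "input columns → flattened output rows" in this style is one way to hand the redesign cleanly to the warehouse team.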
Skills: Data Engineering, Database Architecture, Data Integration, SQL, PostgreSQL, Database Design
Budget:
not specified
6 hours ago
|
|||||
Retool expert low code
|
15 - 65 USD
/ hr
|
6 hours ago |
Client Rank
- Good
$6 287 total spent
9 hires
11 jobs posted
82% hire rate,
open job
5.00
of 5 reviews
|
||
Hi all,
I’m looking for a Retool expert who can build complex platform solutions and fix bugs across all workloads within the platform. Looking for someone to work long term on different projects.
Skills: SQL, Low Code & RAD Software, Supabase, Front-End Development, Back-End Development, Bubble.io
Hourly rate:
15 - 65 USD
6 hours ago
|
|||||
Data Analyst and Visualization Expert
|
15 - 25 USD
/ hr
|
6 hours ago |
Client Rank
- Good
$5 479 total spent
6 hires
12 jobs posted
50% hire rate,
open job
4.95
of 6 reviews
|
||
We are seeking an experienced and reliable contractor to assist us in setting up a robust data infrastructure, including a data warehouse and data analysis/visualization tools.
The ideal candidate will have expertise in managing data warehousing and PowerBI for analysis and visualization.

Responsibilities:
1. Data Warehouse Setup:
- Configure and set up the data warehouse.
- Ensure efficient data storage, organization, and retrieval.
2. Integration and Data Pipeline:
- Establish seamless integration between various data sources and the data warehouse.
- Implement effective data pipelines for smooth data flow.
3. Data Analysis and Visualization:
- Utilize PowerBI to analyze and visualize data for meaningful insights.
- Develop dashboards and reports to facilitate data-driven decision-making.
4. Performance Optimization:
- Monitor and optimize the performance of the data warehouse and analysis tools.
- Implement best practices for data efficiency and query optimization.

Requirements:
- Proficiency in PowerBI for data analysis and visualization.
- Strong background in designing and implementing data warehouses and ETL processes.
- Familiarity with data modeling and database design principles.
- Excellent problem-solving and troubleshooting skills.
- Ability to work independently and meet deadlines.
Skills: Dashboard, SQL, Data Visualization, Data Analysis, Microsoft Power BI, Business Intelligence, Data Modeling, Python
Hourly rate:
15 - 25 USD
6 hours ago
|
|||||
Clickhouse Database Consultant Needed
|
30 - 65 USD
/ hr
|
5 hours ago |
Client Rank
- Excellent
$19 215 total spent
6 hires
18 jobs posted
33% hire rate,
open job
5.00
of 4 reviews
|
||
We are seeking an experienced Clickhouse database consultant to help us maximize the potential of our Clickhouse setup.
Our environment integrates Clickhouse with Kafka and Apache Superset for data processing and visualization. Your expertise will guide us in optimizing our database performance, enhancing query efficiency, and ensuring smooth integration with our existing data pipeline. If you have a solid background in data engineering and are an expert with ClickHouse, we want to hear from you!
Skills: SQL, ETL, Database Programming, Database Management, Database Design
Hourly rate:
30 - 65 USD
5 hours ago
|
|||||
We need a multitenant integration for an ordering (delivery) platform
|
50 USD | 3 hours ago |
Client Rank
- Risky
$45 total spent
4 hires
53 jobs posted
8% hire rate,
open job
0.00
of 1 review
|
||
The details are in the attached documents.
Please read all the information carefully before applying. Tell us the exact price for the project and the estimated development time. A formal quotation, if you have one, will be highly valued.
Skills: HTML, PHP, JavaScript, Web Design, .NET Core, API, SQL Server Integration Services
Fixed budget:
50 USD
3 hours ago
|
|||||
Database Administrator (DBA) - IIoT & Telemetry System Focused
|
not specified | 3 hours ago |
Client Rank
- Risky
|
||
About Us
We're the Digital Solutions Unit of Hitachi Aqua-Tech Engineering Pte. Ltd. Singapore, on a mission to innovate through cloud-based, industrial IoT software. Our team builds smart, scalable solutions for environmental infrastructure, data collection systems, and remote monitoring. We handle high-volume telemetry data from industrial facilities and are looking for a database expert to help us optimize our data architecture.

About the Role
We're seeking an experienced Database Administrator to take ownership of our database architecture and operations. This role focuses on managing our high-frequency industrial telemetry data systems. You'll be responsible for designing, implementing, and maintaining highly available, scalable database solutions that can efficiently handle time-series data from our industrial sensors and IoT devices.

What You'll Do
- Design and implement high-performance database architectures for industrial telemetry data
- Manage and optimize databases, with a focus on time-series data storage solutions
- Monitor database health and performance, and implement fine-tuning and configuration improvements
- Execute disaster recovery procedures and ensure database availability during system outages
- Set up and maintain high-availability cluster environments
- Build comprehensive database infrastructure including monitoring, automation, logging, and backup systems
- Perform stress testing on production databases and implement performance optimizations
- Support developers by resolving data-related technical issues and providing best-practices guidance
- Develop and enforce policies for database maintenance, security, and archiving
- Plan and allocate system storage for current and future database requirements
- Handle database upgrades, patches, and migrations as needed

What We're Looking For
- At least 3 years of experience in database administration, with strong expertise in PostgreSQL and MySQL
- Proven experience handling time-series data and high-ingestion workloads
- Experience with database performance tuning, troubleshooting, and query optimization
- Strong knowledge of database security controls and implementation
- Experience setting up high-availability database environments and disaster recovery systems
- Familiarity with database monitoring tools and automation systems
- Ability to work independently and take ownership of the database infrastructure
- Strong problem-solving skills and attention to detail
- Experience with InfluxDB, TimescaleDB, or similar time-series database solutions
- Prior exposure to industrial telemetry, IoT, or sensor data management
- Experience with AWS RDS or other cloud-based database services
- Knowledge of data compression and archiving techniques for large datasets
- Experience with data pipelines and ETL workflows
Skills: Scripting, MongoDB, Automation, Amazon Web Services, Linux System Administration, SQL, Transact-SQL, C#, Python, Database Administration
Budget:
not specified
3 hours ago
|
|||||
Need Odoo CRM configuration help
|
15 - 35 USD
/ hr
|
2 hours ago |
Client Rank
- Excellent
$16 241 total spent
16 hires
28 jobs posted
57% hire rate,
open job
4.75
of 7 reviews
|
||
I have an Odoo community edition hosting my CRM. I have several different kinds of clients comprising the contacts in the CRM. I want to be able to differentiate them by type of business and email and track them separately.
There is already a custom field added for contacts, which you can see in the attached screenshot. We might be able to use this and just add some additional options, or maybe some sub-options. I'm not sure which is possible. Each contact should be described by one of the following:
- C3PAO
- RPO
- ISO27001
- Attorney
- Recruiter (with C2C and W2 as sub-options)

I have imported a number of contacts with "C3PAO" on the end of the website field. It should be possible to use some SQL on the backend to identify all of those contacts, set the field described above to "C3PAO", and remove the "C3PAO" string from the website field. I have done the same for "Attorney". I know this was not a good way to go and I want to fix it properly now. If this goes well, I have a couple of future small projects similar to this one.
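A sketch of the tag-and-strip backfill this posting describes, run against an in-memory SQLite stand-in for Odoo's `res_partner` table. The custom field name `x_business_type` and the sample rows are invented; on a real Odoo database this would be a Postgres `UPDATE` (ideally tested on a copy first).

```python
import sqlite3

# Toy stand-in for the Odoo res_partner table described in the posting.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE res_partner (id INTEGER, website TEXT, x_business_type TEXT)"
)
conn.executemany(
    "INSERT INTO res_partner VALUES (?, ?, ?)",
    [(1, "https://acme.comC3PAO", None), (2, "https://law.example", "Attorney")],
)

# One pass: tag every contact whose website ends with the marker,
# then strip the marker off the website value.
marker = "C3PAO"
conn.execute(
    """
    UPDATE res_partner
    SET x_business_type = ?,
        website = substr(website, 1, length(website) - ?)
    WHERE website LIKE '%' || ?
    """,
    (marker, len(marker), marker),
)

result = conn.execute(
    "SELECT id, website, x_business_type FROM res_partner ORDER BY id"
).fetchall()
print(result)
# [(1, 'https://acme.com', 'C3PAO'), (2, 'https://law.example', 'Attorney')]
```

The same statement can be rerun with `marker = "Attorney"` (or any other suffix) to clean up the remaining imports.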
Skills: Odoo, Customer Relationship Management, Python, Odoo Development, CRM Software
Hourly rate:
15 - 35 USD
2 hours ago
|
|||||
Create MySQL db on host and migrate FileMaker 19 data there
|
not specified | 2 hours ago |
Client Rank
- Medium
|
||
A high priority is linkability (HookMark app) to either weblinked data or via Filemaker queries saved as pdfs. I can use FM on one device, but the data needs to be accessible to a SQL GUI. I work with DevonTHINK 3, Mellel, Bookends/Zotero.
My FM data would be migrated to MySQL then accessible through FM ODBC and a MySQL GUI like SQL Pro Studio. My SQL queries are basic joins. If you might be interested, I attach a copy of an older version of the fmdb for your review.
Skills: Claris FileMaker, SQL, PHP, Python, JavaScript, FileMaker WebDirect, API Integration, JSON API, REST API, AppleScript, XML, ODBC, MySQL, Database Design, Database Programming
Budget:
not specified
2 hours ago
|
|||||
Backend Developer (Node.js / Laravel / CakePHP) – Multi-User Admin Panel
|
10 - 50 USD
/ hr
|
1 hour ago |
Client Rank
- Excellent
$22 689 total spent
41 hires
51 jobs posted
80% hire rate,
open job
4.97
of 26 reviews
|
||
We are seeking an experienced Backend Developer to build a multi-user admin panel for our debt tracking application. The admin panel will allow authorized users from different companies to manage and view user debts while maintaining strict access control.
Responsibilities:
• Develop a scalable backend for the admin panel using Node.js (Express.js) / Laravel / CakePHP.
• Implement role-based access control (RBAC) for multiple users from different companies.
• Design and optimize APIs to connect with the mobile app.
• Integrate authentication and authorization using JWT/OAuth for secure login.
• Develop company-specific dashboards where each company can view only its debtors.
• Implement data encryption and other security measures to protect sensitive financial data.
• Set up database schema in MySQL / PostgreSQL / MongoDB to store user debts, payments, and company records.
• Build reporting & analytics features to generate insights on debts and payments.
• Enable admin actions, such as adding/editing debts, managing users, and setting up payment reminders.
• Develop notification systems (email/SMS/push notifications) for debt alerts and payment reminders.
• Deploy and maintain the backend on AWS, DigitalOcean, or other cloud platforms.
• Implement logging and monitoring for performance and security tracking.

Requirements:
• Strong experience in backend development with Node.js (Express.js) / Laravel / CakePHP.
• Proficiency in database design using MySQL / PostgreSQL / MongoDB.
• Experience with API development & integration (RESTful APIs; GraphQL is a plus).
• Knowledge of JWT/OAuth authentication and role-based access control (RBAC).
• Understanding of security best practices (data encryption, SQL injection prevention, etc.).
• Experience in building dashboards and admin panels.
• Familiarity with cloud deployment (AWS, Firebase, DigitalOcean, etc.).
• Experience with payment gateway integration is a plus.
• Ability to write clean, efficient, and well-documented code.

Preferred Skills:
• Experience with React.js / Vue.js / Angular for admin panel frontend development.
• Knowledge of microservices architecture.
• Experience in containerization (Docker, Kubernetes).
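A minimal sketch of the company-scoped RBAC idea this posting asks for: every debt query is filtered by the authenticated user's `company_id`, so each company sees only its own debtors. All names (`User`, `visible_debts`, the sample data) are illustrative, not part of any existing codebase.

```python
from dataclasses import dataclass

@dataclass
class User:
    id: int
    company_id: int
    role: str  # "admin" or "viewer"

# Illustrative in-memory data; in practice this is a debts table.
DEBTS = [
    {"id": 1, "company_id": 10, "debtor": "Alice", "amount": 250.0},
    {"id": 2, "company_id": 20, "debtor": "Bob", "amount": 90.0},
]

def visible_debts(user: User) -> list[dict]:
    # Row-level scoping: company_id comes from the authenticated user
    # (e.g. a verified JWT claim), never from the request payload, so a
    # client cannot spoof it to read another company's debtors.
    return [d for d in DEBTS if d["company_id"] == user.company_id]

def can_edit(user: User) -> bool:
    # Role check gates write actions like adding or editing debts.
    return user.role == "admin"

admin_a = User(id=1, company_id=10, role="admin")
print(visible_debts(admin_a))  # only company 10's debtor
print(can_edit(admin_a))       # True
```

In Express.js or Laravel the same two checks would live in middleware plus a query scope, but the principle is identical.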
Skills: CakePHP, RESTful API, API, API Development, Laravel, PHP, Node.js, ExpressJS, MySQL
Hourly rate:
10 - 50 USD
1 hour ago
|
|||||
Java fullstack
|
not specified | 1 hour ago |
Client Rank
- Risky
1 job posted
open job
|
||
Experience in large-scale Java development projects for financial services.
In-depth knowledge of Spring Boot, Kafka, SQL, and Elasticsearch or a similar cache.
Skills: SQL, Java, Database Design, NoSQL Database, Amazon Web Services, API Development, Node.js, Apache Kafka, Spring Boot, Microservice, AWS Lambda, Docker Compose, Gradle, GitHub
Budget:
not specified
1 hour ago
|
|||||
Fix Database Write Issues in Bolt and Supabase
|
30 USD | 1 hour ago |
Client Rank
- Excellent
$4 935 total spent
21 hires
16 jobs posted
100% hire rate,
open job
5.00
of 16 reviews
|
||
I am seeking an experienced developer to help resolve issues with my website built using Bolt and Supabase. Currently, the application is unable to write or save data to the database, and I need someone who can identify and fix this problem efficiently. The ideal candidate will have experience in troubleshooting database issues, particularly with Supabase, and will be able to implement a solution promptly. If you have the necessary skills and can help me get my app up and running smoothly again, please apply.
Skills: PHP, SQL, MySQL, WordPress, JavaScript
Fixed budget:
30 USD
1 hour ago
|
|||||
Backend Developer (Node.js / Laravel / CakePHP) – Multi-User Admin Panel
|
1,000 USD | 1 hour ago |
Client Rank
- Excellent
$22 689 total spent
41 hires
51 jobs posted
80% hire rate,
open job
4.97
of 26 reviews
|
||
We are seeking an experienced Backend Developer to build a multi-user admin panel for our debt tracking application. The admin panel will allow authorized users from different companies to manage and view user debts while maintaining strict access control.
Responsibilities:
• Develop a scalable backend for the admin panel using Node.js (Express.js) / Laravel / CakePHP.
• Implement role-based access control (RBAC) for multiple users from different companies.
• Design and optimize APIs to connect with the mobile app.
• Integrate authentication and authorization using JWT/OAuth for secure login.
• Develop company-specific dashboards where each company can view only its debtors.
• Implement data encryption and other security measures to protect sensitive financial data.
• Set up database schema in MySQL / PostgreSQL / MongoDB to store user debts, payments, and company records.
• Build reporting & analytics features to generate insights on debts and payments.
• Enable admin actions, such as adding/editing debts, managing users, and setting up payment reminders.
• Develop notification systems (email/SMS/push notifications) for debt alerts and payment reminders.
• Deploy and maintain the backend on AWS, DigitalOcean, or other cloud platforms.
• Implement logging and monitoring for performance and security tracking.

Requirements:
• Strong experience in backend development with Node.js (Express.js) / Laravel / CakePHP.
• Proficiency in database design using MySQL / PostgreSQL / MongoDB.
• Experience with API development & integration (RESTful APIs; GraphQL is a plus).
• Knowledge of JWT/OAuth authentication and role-based access control (RBAC).
• Understanding of security best practices (data encryption, SQL injection prevention, etc.).
• Experience in building dashboards and admin panels.
• Familiarity with cloud deployment (AWS, Firebase, DigitalOcean, etc.).
• Experience with payment gateway integration is a plus.
• Ability to write clean, efficient, and well-documented code.

Preferred Skills:
• Experience with React.js / Vue.js / Angular for admin panel frontend development.
• Knowledge of microservices architecture.
• Experience in containerization (Docker, Kubernetes).
Skills: CakePHP, RESTful API, API, API Development, Laravel, PHP, Node.js, ExpressJS, MySQL
Fixed budget:
1,000 USD
1 hour ago
|
|||||
Looking for backend dev instructor
|
not specified | 1 hour ago |
Client Rank
- Good
$1 509 total spent
18 hires
39 jobs posted
46% hire rate,
open job
4.92
of 9 reviews
|
||
I'm looking for someone who can provide real-world backend and DevOps tasks for me to complete in a project-based setting. Ideally, you would create a Dockerized project with a clear scope—such as a RESTful API, microservices architecture, or event-driven system—and provide a task document outlining specific objectives.
Examples of backend tasks might include:
- Designing and building REST or GraphQL APIs using Node.js, Python, or Go
- Implementing authentication and authorization (e.g., OAuth2, JWT)
- Integrating with third-party APIs (e.g., Stripe, SendGrid, AWS services)
- Writing unit/integration tests and setting up CI pipelines

DevOps tasks could include:
- Writing Dockerfiles and docker-compose setups for development and production
- Creating infrastructure-as-code templates (e.g., using Terraform or AWS CloudFormation)
- Setting up CI/CD pipelines (e.g., GitHub Actions, GitLab CI, Jenkins)
- Monitoring and logging with tools like Prometheus, Grafana, the ELK stack, etc.
- Deploying to cloud platforms (AWS, GCP, Azure, etc.)

We would meet a few times per week to go over completed tasks, review code, discuss architectural decisions, and share feedback or additional learning resources. The goal is to simulate a real-world team workflow and gain hands-on experience with production-level backend and DevOps work. If you are interested, please let me know your experience as well as a quick outline of the tasks/workflow you can assign. Include the word "orange" so I know you read this. Thanks.
Skills: Java, Database Design, Database Maintenance, SQL, Docker, Kubernetes, AWS Lambda, Terraform
Budget:
not specified
1 hour ago
|
|||||
Build Structured, Queryable Database from PDFs and CSVs for Data Visualization Project
|
8 - 15 USD
/ hr
|
1 hour ago |
Client Rank
- Medium
$42 total spent
2 hires
4 jobs posted
50% hire rate,
open job
|
||
We’re seeking a detail-oriented data engineer or developer with strong experience in data extraction, transformation, and database structuring to help us build a robust, queryable database from a collection of PDFs and CSV files.
The goal is to extract meaningful data from these documents and organize it into a clean, relational database (e.g., SQLite, PostgreSQL, MySQL, or a format easily parsed by Python/SQL-based tools like pandas or SQLAlchemy). This database will ultimately feed into data visualization platforms and be used for deeper analysis.

Project Scope:
• Extract tabular and structured data from a variety of PDFs (text-based and scanned) and CSV files.
• Normalize and clean the data for consistency.
• Design and implement a relational database schema that reflects the relationships across datasets.
• Store data in a format compatible with Python (pandas) and SQL queries (SQLite/PostgreSQL preferred).
• Provide documentation for the schema and data sources.
• (Optional but ideal) Include a basic script or notebook showing how to connect to and query the database with Python.

Skills Required:
• Expertise in PDF data extraction (e.g., Tabula, Camelot, PyMuPDF, PDFPlumber)
• Strong Python skills (pandas, SQLAlchemy, etc.)
• Experience designing relational databases
• SQL (PostgreSQL, MySQL, or SQLite)
• Familiarity with CSV data cleaning and transformation
• Experience with OCR tools like Tesseract for scanned PDFs (nice to have)

Deliverables:
• Clean and structured database in a format like .db, .sql, or .csv sets with an accompanying schema.
• Script(s) or notebook(s) to parse and ingest new files into the database.
• Documentation of the schema, extraction logic, and assumptions made.
• Optional: Recommendations for improving the pipeline or optimizing for future scalability.

Ideal Candidate:
You are someone who takes pride in clean, well-structured data. You’re familiar with the quirks of PDF extraction and enjoy creating systems that make messy data usable. You communicate well, document your process, and can work independently.
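A small sketch of the CSV-to-relational ingestion step this posting describes, using only the standard library (`csv` + `sqlite3`). The CSV content, table, and column names are invented; a real pipeline would read files from disk and likely add type validation per column.

```python
import csv
import io
import sqlite3

# Stand-in for one of the source CSV files.
raw_csv = io.StringIO("sku,price\nA,9.50\nB,3.25\n")

# Target relational store: an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (sku TEXT PRIMARY KEY, price REAL)")

# Parse rows, coerce types, and bulk-insert with named parameters.
reader = csv.DictReader(raw_csv)
conn.executemany(
    "INSERT INTO products VALUES (:sku, :price)",
    ({"sku": r["sku"], "price": float(r["price"])} for r in reader),
)

# Once the data is relational, downstream tools can query it with SQL.
total = conn.execute("SELECT round(sum(price), 2) FROM products").fetchone()[0]
print(total)  # 12.75
```

PDF sources would feed the same insert path after extraction with a tool like pdfplumber or Camelot.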
Skills: Data Extraction, Screen Scraping, Web Crawling, Web Scraping, SQL, Python, Data Scraping, Data Visualization
Hourly rate:
8 - 15 USD
1 hour ago
|
|||||
Biztalk developer| EDI| .NET Framework | C#| BRE
|
15 - 20 USD
/ hr
|
1 hour ago |
Client Rank
- Risky
|
||
Looking for job support for the stack below:
.NET, C#, SQL, BizTalk Server (schemas, pipelines, maps, ports, orchestrations), EDI ANSI X12
Skills: .NET Framework, C#, SQL, ASP.NET, JSON, BizTalk Server, EDIFACT
Hourly rate:
15 - 20 USD
1 hour ago
|