No search results found.
Sorry, we could not find the job you were looking for.
You can browse the latest jobs below:
Find friends through the app, something like a Tinder.
Hello, how are you? I have a great job offer for people who want to earn more money. This offer is for people who have many typing and translation projects. If you are interested, please bid and wait for us to contact you.
We have recreated some UI for which there is an update in the APIs, and it needs integration into a Unity 2D app.
We are looking for an individual who can extract content from PDF files. The files are written entirely in English. We want 100% accurate results. Very easy and simple.
PDF and JPG files are to be converted into document files by typing manually. No experience is needed other than basic knowledge of Microsoft Word; all other guidance will be provided through a demonstration video. The candidate should be fluent in English.
We hope that as members progress, more people will follow us on a global scale! (We need valid participant material, which has been shown in the form of a video.) The name of our project: FlowCloud. FC web version login: [Login to view the URL.] Twitter: [Login to view the URL.] Official website: [Login to view the URL.] Telegraph: [Login to view the URL.]
Native Android app from scratch; REST API integration of some stocks and commodities; some custom calculations on API values; storing the API values in a database; alerts and some misc. things...
I have a site on a Windows server, along with SQL, and I have the credentials to upload it. You will have to upload the site from the Windows server to my FTP account using my credentials. I know this is a very simple task, so bid accordingly.
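The upload described above can be sketched with Python's standard-library `ftplib`. This is a minimal sketch only: the hostname, credentials, and local folder are hypothetical placeholders, and the real values would come from the client.

```python
import ftplib
import os
import posixpath


def remote_path(local_root, local_file, remote_root="/"):
    """Map a local file path to its destination path on the FTP server."""
    rel = os.path.relpath(local_file, local_root)
    return posixpath.join(remote_root, *rel.split(os.sep))


def _ensure_dirs(ftp, path):
    """Create each directory segment, ignoring 'already exists' errors."""
    current = ""
    for part in (p for p in path.split("/") if p):
        current += "/" + part
        try:
            ftp.mkd(current)
        except ftplib.error_perm:
            pass  # directory likely exists already


def upload_site(local_root, host, user, password, remote_root="/"):
    """Walk local_root and upload every file via FTP STOR."""
    with ftplib.FTP(host, user, password) as ftp:
        for dirpath, _dirs, files in os.walk(local_root):
            for name in files:
                local_file = os.path.join(dirpath, name)
                target = remote_path(local_root, local_file, remote_root)
                _ensure_dirs(ftp, posixpath.dirname(target))
                with open(local_file, "rb") as fh:
                    ftp.storbinary(f"STOR {target}", fh)


# Hypothetical usage; host and credentials are placeholders:
# upload_site("./my_site", "ftp.example.com", "user", "secret")
```

For a site backed by SQL, the database would typically be moved separately (e.g. a dump restored on the target server); plain FTP only covers the files.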
Project only for Mihajlo - new drawings
I am looking for somebody experienced and well skilled in Python to develop a script for us that can automatically download some files using a date. Requirements: 1. Write a Python script that will scrape the India finance data daily. 2. Store the scraped data in a MySQL database daily, appending a new row each day. 3. Set up a cron schedule to scrape this information daily into a m...
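The three requirements above can be sketched as follows. Everything specific here is an assumption: the download URL pattern, the table name `finance_daily`, and the script path in the crontab line are hypothetical stand-ins, since the posting does not name the actual data source or schema. The network fetch and the MySQL `execute` call are deliberately left to the integration, as credentials are unknown.

```python
import datetime

# Hypothetical endpoint; the real data source would come from the client.
BASE_URL = "https://example.com/india-finance"


def build_url(day):
    """Build the daily download URL from a date (assumed naming scheme)."""
    return f"{BASE_URL}/data-{day:%Y-%m-%d}.csv"


def insert_sql(table="finance_daily"):
    """Parameterised INSERT that appends one row per day (assumed schema)."""
    return f"INSERT INTO {table} (scrape_date, payload) VALUES (%s, %s)"


def run_daily(day=None):
    """One scheduled run: resolve the day's URL and the SQL to execute.

    A real implementation would fetch build_url(day) (urllib/requests)
    and run insert_sql() against MySQL with the fetched payload.
    """
    day = day or datetime.date.today()
    return build_url(day), insert_sql()


# Hypothetical crontab entry running the script every day at 18:30
# (path is a placeholder):
# 30 18 * * * /usr/bin/python3 /opt/scraper/daily.py
```

Appending "on the next row" each day falls out naturally from `INSERT`; no `UPDATE` is needed as long as `scrape_date` is a plain column (or a unique key, if duplicate runs per day should fail loudly).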