Getting the top currency prices in USD, with support for limiting results and more features
- Getting Started
- Folder Structure
- Install
- Objectives
- Implementation Solution
- How Edge Cases Are Handled
- Intended Approach
- Images Workflow
This is an app to get the top assets' prices in USD, built on top of 3 independent services, with:
- result limiting
- up-to-date information
- speed
```
.
├── ...
├── src                              # all three modules live here
│   ├── api                          # the API service
│   │   ├── db                       # DB connection source point
│   │   ├── coin_prices_receiver     # reads all the coin prices from the database
│   │   └── top_assets_prices_handler  # queries the top assets and generates the end result
│   └── producer                     # holds the two services that fetch the data
│       ├── coins_prices
│       │   ├── db                   # DB connection
│       │   ├── model                # struct model for coin prices and some helper methods
│       │   ├── data_source          # fetches the coin prices data from the API
│       │   └── handler              # saves the API data to the database
│       └── top_assets
│           ├── db                   # DB connection
│           ├── model                # struct model for top assets and some helper methods
│           ├── data_source          # fetches the top assets data from the API
│           └── handler              # saves the API data to the database
└── ...
```
- Note that this may not be the most idiomatic structure for Go.
- To make the project work you need at least PostgreSQL and the Go compiler.
- Then you need to install the dependencies of this application. Unfortunately I did not have time to find a better way to install them; of the many programming languages I have worked with, Go seems to need a bit more work to install dependencies,
- so I kept the dependencies on outside packages to a minimum.
- Anyway, the project consists of 3 services, so you need to go into each one and run `go get .`
- Docker Compose is also supported.
- A .env file is not used, for simplicity.
- Provide an HTTP endpoint to fetch the top assets and their USD prices (a minimal handler sketch follows this list)
- Support limiting the number of returned results
- Merge the data of two APIs to produce the required information
- Up-to-date information
- Three independent services, each for a specific task
- Descriptive README file
- Tests
- Docker
- Output as JSON or CSV
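To make the endpoint and its limit support concrete, here is a minimal sketch; the `/top-assets` path, the default and maximum limits, and the `loadMergedAssets` helper are assumptions for illustration, not the project's actual names.

```go
package main

import (
	"encoding/json"
	"net/http"
	"strconv"
)

// Asset is one merged record: a symbol with its rank and USD price.
type Asset struct {
	Symbol string  `json:"symbol"`
	Rank   int     `json:"rank"`
	Price  float64 `json:"price"`
}

// topAssetsHandler serves the merged data as JSON and honours an
// optional ?limit=N query parameter.
func topAssetsHandler(w http.ResponseWriter, r *http.Request) {
	limit := 100 // assumed default
	if v := r.URL.Query().Get("limit"); v != "" {
		if n, err := strconv.Atoi(v); err == nil && n > 0 && n <= 200 {
			limit = n
		}
	}

	assets := loadMergedAssets(limit) // stand-in for the real DB query + merge

	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(assets)
}

// loadMergedAssets is a hypothetical helper; the real service reads the
// two tables and merges them as described later in this README.
func loadMergedAssets(limit int) []Asset {
	return make([]Asset, 0, limit)
}

func main() {
	http.HandleFunc("/top-assets", topAssetsHandler)
	http.ListenAndServe(":8080", nil)
}
```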
- Simply, my plan was to break the task into three separate services:
- one to fetch the top assets, running as a cron job every minute
- one to fetch the assets' prices in USD, running as a cron job every minute
- both pushing their data into a shared database periodically (a rough scheduling sketch follows this list)
- and one to provide an API for the merged data, backed by persistent storage
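As a rough sketch of the per-minute scheduling (assuming a plain `time.Ticker` loop; the real services may use an actual cron library instead):

```go
package main

import (
	"log"
	"time"
)

// runEveryMinute is a stand-in for the cron behaviour: it runs the job
// immediately and then once per minute until the process exits.
func runEveryMinute(name string, job func() error) {
	ticker := time.NewTicker(time.Minute)
	defer ticker.Stop()
	for {
		if err := job(); err != nil {
			log.Printf("%s failed: %v", name, err)
		}
		<-ticker.C
	}
}

// fetchTopAssets and fetchCoinPrices are hypothetical producer jobs that
// call the provider APIs and save the results into PostgreSQL.
func fetchTopAssets() error  { return nil }
func fetchCoinPrices() error { return nil }

func main() {
	go runEveryMinute("top_assets", fetchTopAssets)
	runEveryMinute("coin_prices", fetchCoinPrices)
}
```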
- There are many possible solutions for such a structure; I went with a simple approach:
save the data into a shared database, which is PostgreSQL,
with a cron job that runs every minute to save the two services' data.
Service 1 pushes into the table coin_prices: id int, symbol text, price real.
Service 2 pushes into the table top_assets: id int, symbol text, rank int (see the model sketch below).
The rank and the API provider's paging were tricky; see the code for more information. The data is limited to a maximum of 100 records per page, and the rank needs to be stored in a way that makes it easy to update records with the new rank in the future. The API service then consumes this data to build the merged result served over HTTP.
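For illustration, the two tables could map to go-pg models roughly like this (a sketch based on the columns above; the project's actual field names may differ):

```go
// CoinPrice maps to the coin_prices table filled by the prices producer.
type CoinPrice struct {
	tableName struct{} `pg:"coin_prices"`

	ID     int     `pg:"id,pk"`
	Symbol string  `pg:"symbol"`
	Price  float64 `pg:"price"`
}

// TopAsset maps to the top_assets table filled by the top-assets producer.
type TopAsset struct {
	tableName struct{} `pg:"top_assets"`

	ID     int    `pg:"id,pk"`
	Symbol string `pg:"symbol"`
	Rank   int    `pg:"rank"`
}
```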
- Integration tests are not implemented, but were intended.
- The Docker config is not tested.
- The go-pg driver is used, and DB access is built with structs (see the model sketch above) instead of querying the DB directly with raw SQL.
- Keeping the information up to date is solved by running the two services indefinitely, once every minute,
and then inserting or updating the records (an upsert), as sketched below.
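A minimal upsert sketch with go-pg, reusing the `CoinPrice` model from above (this assumes a unique constraint on `symbol`; the real code may key on `id` instead):

```go
import "github.com/go-pg/pg/v10"

// upsertCoinPrice inserts the row, or updates the stored price when the
// symbol already exists, so every cron cycle refreshes the same records.
func upsertCoinPrice(db *pg.DB, cp *CoinPrice) error {
	_, err := db.Model(cp).
		OnConflict("(symbol) DO UPDATE").
		Set("price = EXCLUDED.price").
		Insert()
	return err
}
```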
- The cryptocompare API provider returns limited data, at most 100 records per page.
This is solved by running two loops when querying and inserting or updating the records for top_assets.
- The tricky part: once I had two loops fetching 100 records each (200 in total), I hit a problem because the rank was being saved from the loop iterator, so it did not carry across pages. This has been solved; one way to handle it is sketched below.
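Roughly, one way to handle the paging and keep the rank global across pages (`fetchTopAssetsPage` is a hypothetical client call against the provider; the real code lives in the top_assets data_source):

```go
const pageSize = 100 // the provider returns at most 100 records per page

// fetchTopAssetsPage is a hypothetical API call returning one page of symbols.
func fetchTopAssetsPage(page, limit int) ([]string, error) {
	// ... call the provider with the page and limit parameters ...
	return nil, nil
}

// fetchAllTopAssets pulls two pages of 100 records and assigns a global
// rank, so the second page continues at 101 instead of restarting at 1.
func fetchAllTopAssets() ([]TopAsset, error) {
	var all []TopAsset
	for page := 0; page < 2; page++ {
		symbols, err := fetchTopAssetsPage(page, pageSize)
		if err != nil {
			return nil, err
		}
		for i, symbol := range symbols {
			all = append(all, TopAsset{
				Symbol: symbol,
				Rank:   page*pageSize + i + 1, // global rank across pages
			})
		}
	}
	return all, nil
}
```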
- So, how do I construct the required data and perform the merge?
This is done by:
- getting all the currency prices into an array
- getting the limited top_assets data, ordered by rank (important and tricky)
- for each element in top_assets, mapping it against the array of currency prices to get its price, and breaking out of the loop once the record is found (tricky); see the sketch after this list
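A sketch of that merge, reusing the models from above (a map keyed by symbol would avoid the inner scan, but the nested loop with an early break matches the approach described here):

```go
// MergedAsset is the record the API service returns for each top asset.
type MergedAsset struct {
	Symbol string  `json:"symbol"`
	Rank   int     `json:"rank"`
	Price  float64 `json:"price"`
}

// mergeTopAssets walks the top assets (already ordered by rank) and, for
// each one, scans the prices slice until the matching symbol is found,
// then breaks out of the inner loop.
func mergeTopAssets(topAssets []TopAsset, prices []CoinPrice) []MergedAsset {
	merged := make([]MergedAsset, 0, len(topAssets))
	for _, asset := range topAssets {
		for _, price := range prices {
			if price.Symbol == asset.Symbol {
				merged = append(merged, MergedAsset{
					Symbol: asset.Symbol,
					Rank:   asset.Rank,
					Price:  price.Price,
				})
				break // record found, stop scanning the prices
			}
		}
	}
	return merged
}
```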
I believe a better approach would be to:
- use queues, something like a producer/consumer setup,
- where the producers are the two services, each pushing onto a separate queue,
- and the consumer is the transformer, or merging, service.
- This can be done via Kafka or RabbitMQ.
- That would end up with a stream processor that listens to both queues and transforms them into a third queue (a toy sketch with channels follows this list).
- The end client then only has to listen to the final stream to get updates.
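To make the idea concrete, here is a toy sketch with Go channels standing in for the Kafka/RabbitMQ queues (a real version would use the broker's client library; the models come from the sketches above):

```go
// runStreamProcessor merges the two input streams into one output stream:
// it remembers the latest USD price per symbol and, whenever a ranked
// asset arrives, emits a merged record onto the third queue.
func runStreamProcessor(topAssets <-chan TopAsset, prices <-chan CoinPrice) <-chan MergedAsset {
	merged := make(chan MergedAsset)
	go func() {
		latest := make(map[string]float64) // latest price per symbol
		for {
			select {
			case p := <-prices:
				latest[p.Symbol] = p.Price
			case a := <-topAssets:
				merged <- MergedAsset{Symbol: a.Symbol, Rank: a.Rank, Price: latest[a.Symbol]}
			}
		}
	}()
	return merged
}
```

The end client (or the API service) would then simply range over the returned channel.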
The current implementation is also not that bad,
but it could be better with a cache store, maybe Redis, that caches the transformed (merged) data,
so the data does not have to be fetched and merged on every HTTP call; a sketch follows.
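A sketch of that caching layer, assuming the go-redis client and a one-minute TTL to match the producers' refresh interval (the key name and the rebuild callback are illustrative):

```go
import (
	"context"
	"encoding/json"
	"time"

	"github.com/redis/go-redis/v9"
)

const mergedKey = "top_assets:merged" // illustrative cache key

// getMergedAssets returns the cached merged result when present; on a
// cache miss it rebuilds the result, stores it with a short TTL, and
// returns it, so most HTTP calls skip the query-and-merge work.
func getMergedAssets(ctx context.Context, rdb *redis.Client, rebuild func() ([]MergedAsset, error)) ([]MergedAsset, error) {
	if raw, err := rdb.Get(ctx, mergedKey).Bytes(); err == nil {
		var cached []MergedAsset
		if json.Unmarshal(raw, &cached) == nil {
			return cached, nil // cache hit
		}
	}

	// Cache miss (or unreadable entry): rebuild from the database.
	assets, err := rebuild()
	if err != nil {
		return nil, err
	}
	if raw, err := json.Marshal(assets); err == nil {
		rdb.Set(ctx, mergedKey, raw, time.Minute)
	}
	return assets, nil
}
```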
Check the Postman result: each asset contains a rank and a price.
Check that 200 records exist in the two tables, and that each cron-job cycle updates the records if needed, with the order and rank respected.