
API version: 1.0.0

API Overview

The CryptoMood API provides real-time streams and RPC methods for market sentiment and data requests. We use Google Protocol Buffers with gRPC.

This documentation contains descriptions of all the models/messages used, along with basic usage in the supported languages. Every method and request is described with sample code and examples. For more extensive examples, visit our public repository.

GRPC

If you are not planning to develop your own API calls, feel free to use one of our fully compliant client libraries below.

Client libraries

Examples

Go examples

Python examples

Python extras

Nodejs examples

Prerequisites

# --------------------------------
# Instructions for Python
# --------------------------------
# Make sure you have Python 3.4 or higher

# Ensure you have `pip` version 9.0.1 or higher: 
python -m pip install --upgrade pip

# Install gRPC, gRPC tools 
python -m pip install grpcio grpcio-tools

# transpile proto file to `*.py` files with
python -m grpc_tools.protoc -I./ --python_out=. --grpc_python_out=. ./types.proto

# This will generate the transpiled files `types_pb2.py` and `types_pb2_grpc.py` in the current directory
# --------------------------------
# Instructions for GoLang
# --------------------------------
GIT_TAG="v1.2.0" # change as needed
go get -d -u github.com/golang/protobuf/protoc-gen-go
git -C "$(go env GOPATH)"/src/github.com/golang/protobuf checkout $GIT_TAG
go install github.com/golang/protobuf/protoc-gen-go
# --------------------------------
# Instructions for NodeJS
# --------------------------------
# Follow https://www.npmjs.com/package/google-protobuf

Protocol Buffers (a.k.a., protobuf) are Google's language-neutral, platform-neutral, extensible mechanism for serializing structured data. You can find protobuf's documentation on the Google Developers site.

To install protobuf, you need the protocol compiler (used to compile .proto files) and the protobuf runtime for your chosen programming language.

For more information, follow the instructions on Google's Protocol Buffers website.

CryptoMood protobuf introduction

As mentioned above, we provide API calls via protocol buffers in two forms: streams and RPC calls.

// Unary RPC call
rpc SayHello(HelloRequest) returns (HelloResponse){
}
// Server-side streaming
rpc LotsOfReplies(HelloRequest) returns (stream HelloResponse){
}
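The difference matters on the client side: a unary RPC returns a single response object, while a server-streaming RPC returns an iterator you consume in a loop. A minimal illustrative sketch in plain Python (no real gRPC connection; the functions merely stand in for generated stubs):

```python
# Illustrative sketch only: plain functions stand in for generated gRPC stubs.

def say_hello(request):
    """Unary: one request -> one response."""
    return f"Hello, {request}"

def lots_of_replies(request):
    """Streaming: one request -> iterator of responses."""
    for i in range(3):
        yield f"Hello #{i}, {request}"

print(say_hello("world"))                # single object
for reply in lots_of_replies("world"):   # consumed like `for msg in stream`
    print(reply)
```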

Connection & Authentication

// Load the protobuffer definitions
const proto = grpc.loadPackageDefinition(
  protoLoader.loadSync("types.proto", {
    keepCase: true,
    longs: String,
    enums: String,
    defaults: true,
    oneofs: true
  })
);

// Initialize the MessagesProxy service. You have to provide a valid host address and a valid path to the .pem file
const client = new proto.MessagesProxy(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
# Create credentials and join the channel. You have to provide a valid path to the .pem file and a valid host address.
creds = grpc.ssl_channel_credentials(open("./cert.pem", 'rb').read())
channel = grpc.secure_channel(SERVER_ADDRESS, creds)
// install protoc [https://github.com/golang/protobuf]
// and protoc-gen-go [https://github.com/golang/protobuf/tree/master/protoc-gen-go]
// transpile proto file to `*.go` file with 
// `protoc -I .. -I $GOPATH/src --go_out=plugins=grpc:./ ../types.proto`
// This will generate the transpiled file in the current directory.
// To adhere to Go conventions, move it to a directory named e.g. `types`.

// Load credentials
creds, err := credentials.NewClientTLSFromFile("./cert.pem", "")

// Dial the server
conn, err := grpc.Dial(Server, grpc.WithTransportCredentials(creds), 
  grpc.WithTimeout(5 * time.Second), grpc.WithBlock())

Each client is granted an X.509 certificate to verify the subscriber/requester. Certificates are issued for the specific subdomain where the B2B server is running. We use a dedicated server for each client, so:

client A will be using server located at <clientA>.api.cryptomood.com, while
client B will be provided server at <clientB>.api.cryptomood.com.
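Putting the subdomain scheme together, the server address can be assembled as below. The client id and port here are placeholder assumptions for illustration, not values issued by CryptoMood:

```python
# Hypothetical client id and port -- substitute the values issued to you.
CLIENT_ID = "clientA"
SERVER_ADDRESS = f"{CLIENT_ID}.api.cryptomood.com:443"
print(SERVER_ADDRESS)
```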

After a successful connection you can use the subscriptions described in the sections below.

# Initialize required service and call required method (in this case subscription)
stub = types_pb2_grpc.MessagesProxyStub(channel)
tweet_stream = stub.SubscribeTweet(empty_pb2.Empty())

# Read data indefinitely
for tweet in tweet_stream:
    print(tweet)
// Subscribe to required stream and listen to incoming data
let channel = client.SubscribeTweet();
channel.on("data", function(message) {
  console.log(message);
});
// Initialize required service and call required method (in this case subscription)
proxyClient := types.NewMessagesProxyClient(conn)
sub, err := proxyClient.SubscribeTweet(context.Background(), &empty.Empty{})

// Read data indefinitely

for {
    msg, err := sub.Recv()
    if err == io.EOF {
        break
    }
    if sub.Context().Err() != nil {
        _ = sub.CloseSend()
        fmt.Println("Closing connection to server")
        break
    }
    fmt.Println(msg.Base.Content, err)
}

Protocol buffers make it easy to implement your own client libraries. You can think of the proto file as the definition of our API. Each supported language has tools to consume these definitions; e.g. Node.js can load the *.proto file directly and use its API. For Python, we recommend generating classes from this file with the gRPC tools. This documentation provides basic examples which you can use with little to no effort.

Data calls

Our internal microservices communicate with each other using gRPC. The pieces of information exchanged between these services are called data models. They represent the entities which are processed (analyzed) and distributed.

We distinguish between them based on where they come from:

RPC

Article

Requesting old articles

const client = new proto.HistoricData(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync(CERT_FILE_PATH))
);
client.HistoricArticles({ from: { seconds: 1561400800}, to: { seconds: 1561428800}}, function(err, res) {
  console.log(res)
});
# Example: https://github.com/cryptomood/api/blob/master/python/HistoricData/HistoricArticles/client.py
historic_request_kwargs = {'from': from_time, 'to': to_time}
req = types_pb2.HistoricRequest(**historic_request_kwargs)
article_items = stub.HistoricArticles(req)

for article in article_items.items:
  print(article.base.id, article.base.content)
historicClient := types.NewHistoricDataClient(conn)
historicRequest := &types.HistoricRequest{From: &timestamp.Timestamp{Seconds: 1561400800},
  To: &timestamp.Timestamp{Seconds: 1561428800}}
sub, err := historicClient.HistoricArticles(context.Background(), historicRequest)
if err != nil {
        panic(err)
}
fmt.Println(len(sub.Items))

Subscribing to articles stream

const client = new proto.MessagesProxy(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
let channel = client.SubscribeArticle();
channel.on("data", function(message) {
  console.log(message);
});
stub = types_pb2_grpc.MessagesProxyStub(channel)
article_stream = stub.SubscribeArticle(empty_pb2.Empty())
for msg in article_stream:
    print(msg)
proxyClient := types.NewMessagesProxyClient(conn)
sub, _ := proxyClient.SubscribeArticle(context.Background(), &empty.Empty{})
for {
    msg, _ := sub.Recv()
    fmt.Println(msg)
}

Articles are data models representing blog posts, articles and other pieces of information which appear on article websites. You can request old articles using an RPC call, or subscribe to incoming new articles.

Basic model for news and articles. Its weight depends on the Alexa rank of the source.

Field Type Label Description
base BaseModel
sentiment SentimentModel
named_entities NamedEntitiesModel named entities from content
title_data NamedEntitiesModel named entities from title

Twitter

API calls for subscribing to Twitter tweets.

Subscribing to twitter stream

const client = new proto.MessagesProxy(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
let channel = client.SubscribeTweet();
channel.on("data", function(message) {
  console.log(message);
});
stub = types_pb2_grpc.MessagesProxyStub(channel)
tweet_stream = stub.SubscribeTweet(empty_pb2.Empty())
for tweet in tweet_stream:
    print(tweet)
proxyClient := types.NewMessagesProxyClient(conn)
sub, _ := proxyClient.SubscribeTweet(context.Background(), &empty.Empty{})
for {
    msg, _ := sub.Recv()
    fmt.Println(msg)
}

Tweet model

Field Type Label Description
base BaseModel
sentiment SentimentModel
named_entities NamedEntitiesModel
extended_tweet ExtendedTweet data from original tweet

Reddit

API calls for requesting and subscribing to new posts from Reddit.

Subscribing to reddit stream

const client = new proto.MessagesProxy(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
let channel = client.SubscribeReddit();
channel.on("data", function(message) {
  console.log(message);
});
stub = types_pb2_grpc.MessagesProxyStub(channel)
reddit_stream = stub.SubscribeReddit(empty_pb2.Empty())
for msg in reddit_stream:
    print(msg)
proxyClient := types.NewMessagesProxyClient(conn)
sub, _ := proxyClient.SubscribeReddit(context.Background(), &empty.Empty{})
for {
    msg, _ := sub.Recv()
    fmt.Println(msg)
}
Field Type Label Description
base BaseModel
sentiment SentimentModel
named_entities NamedEntitiesModel named entities from content
title_data NamedEntitiesModel named entities from title
reddit_post RedditPostModel data from original reddit post

Telegram

For subscribing to new messages from Telegram.

Subscribing to telegram stream

const client = new proto.MessagesProxy(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
let channel = client.SubscribeTelegram();
channel.on("data", function(message) {
  console.log(message);
});
stub = types_pb2_grpc.MessagesProxyStub(channel)
telegram_stream = stub.SubscribeTelegram(empty_pb2.Empty())
for msg in telegram_stream:
    print(msg)
proxyClient := types.NewMessagesProxyClient(conn)
sub, _ := proxyClient.SubscribeTelegram(context.Background(), &empty.Empty{})
for {
    msg, _ := sub.Recv()
    fmt.Println(msg)
}

Message from a Telegram channel

The weight is calculated from the number of members in the channel.

Field Type Label Description
user_message UserMessage
symbols_backup string repeated symbols loaded from db
channel_id int32 telegram channel ID
channel_subscriber_count int32 telegram channel members
message_id int64 telegram message ID

Bitmex

For subscribing to new messages from the BitMEX chat.

Subscribing to bitmex stream

const client = new proto.MessagesProxy(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
let channel = client.SubscribeBitmex();
channel.on("data", function(message) {
  console.log(message);
});
stub = types_pb2_grpc.MessagesProxyStub(channel)
bitmex_stream = stub.SubscribeBitmex(empty_pb2.Empty())
for msg in bitmex_stream:
    print(msg)
proxyClient := types.NewMessagesProxyClient(conn)
sub, _ := proxyClient.SubscribeBitmex(context.Background(), &empty.Empty{})
for {
    msg, _ := sub.Recv()
    fmt.Println(msg)
}
Field Type Label Description
user_message UserMessage

MessagesProxy service

Service for streaming entries

Method Name Request Type Response Type Description
SubscribeBaseArticle AssetsFilter PublicModel stream
SubscribeBaseTweet AssetsFilter PublicModel stream
SubscribeBaseReddit AssetsFilter PublicModel stream
SubscribeBaseTelegram AssetsFilter PublicModel stream
SubscribeBaseBitmex AssetsFilter PublicModel stream
SubscribeArticle AssetsFilter Article stream
SubscribeTweet AssetsFilter Tweet stream
SubscribeReddit AssetsFilter RedditPost stream
SubscribeTelegram AssetsFilter TelegramUserMessage stream
SubscribeBitmex AssetsFilter BitmexUserMessage stream

Sentiments service

A Sentiment message holds information about the aggregated sentiment for a specific time window (M1, H1). It is emitted each time a window closes or is updated (for a specific asset or resolution). If your application needs sentiment updates for only one specific asset, it needs to filter them on your side.

Sentiment candles are divided into two basic groups: news sentiment and social sentiment. Their payload is the same.

Method Name Request Type Response Type Description
HistoricSocialSentiment SentimentHistoricRequest AggregationCandle stream
HistoricNewsSentiment SentimentHistoricRequest AggregationCandle stream
SubscribeSocialSentiment AggregationCandleFilter AggregationCandle stream
SubscribeNewsSentiment AggregationCandleFilter AggregationCandle stream
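Because the service does not filter per asset for you, that filtering happens in your client. A minimal sketch, with `SimpleNamespace` objects standing in for received AggregationCandle messages (no live stream here):

```python
from types import SimpleNamespace

# Stand-ins for incoming AggregationCandle messages.
stream = [
    SimpleNamespace(asset="BTC", resolution="M1", a=0.4),
    SimpleNamespace(asset="ETH", resolution="M1", a=-0.1),
    SimpleNamespace(asset="BTC", resolution="H1", a=0.2),
]

# Keep only M1 candles for BTC; everything else is dropped client-side.
wanted = [c for c in stream if c.asset == "BTC" and c.resolution == "M1"]
for candle in wanted:
    print(candle.asset, candle.resolution, candle.a)
```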

Subscribing to sentiment stream

 const client = new proto.Sentiments(
   SERVER,
   grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
 );
 let channel = client.SubscribeNewsSentiment({ resolution: 'M1', assets_filter: { assets: ['BTC']}});
 channel.on("data", function(message) {
   console.log(message);
 });
 stub = types_pb2_grpc.SentimentsStub(channel)
 candle_filter = types_pb2.AggregationCandleFilter(
     resolution='M1', assets_filter=types_pb2.AssetsFilter(assets=['BTC']))
 sentiment_stream = stub.SubscribeNewsSentiment(candle_filter)
 for sentiment in sentiment_stream:
     print(sentiment)
 proxyClient := types.NewSentimentsClient(conn)
 sub, _ := proxyClient.SubscribeNewsSentiment(context.Background(), &types.AggregationCandleFilter{
   Resolution: "M1",
   AssetsFilter: &types.AssetsFilter{
     Assets: []string{"BTC"},
   },
 })
 for {
     msg, _ := sub.Recv()
     fmt.Println(msg)
 }

AggregationCandle

A Candle message holds information about the aggregated sentiment for a specific time window. It is emitted for each changed symbol. If your application needs sentiment updates for every asset, it needs to subscribe for each of them on your side.

Field Type Label Description
id AggId used for constructing time-based keys
asset string
resolution string
pv int64 counter for positive items
nv int64 counter for negative items
ps double positive sentiment sum
ns double negative sentiment sum
a double aggregated value

AggId

Field Type Label Description
year int32
month int32
day int32
hour int32
minute int32

AggregationCandleFilter

Field Type Label Description
resolution string resolution for candle - M1/H1
assets_filter AssetsFilter

HistoricData

Service for requesting historic data

Method Name Request Type Response Type Description
HistoricBaseTweets HistoricRequest PublicModel stream
HistoricBaseArticles HistoricRequest PublicModel stream
HistoricBaseRedditPosts HistoricRequest PublicModel stream
HistoricTelegramMessages HistoricRequest PublicModel stream
HistoricTweets HistoricRequest Tweet stream
HistoricArticles HistoricRequest Article stream
HistoricRedditPosts HistoricRequest RedditPost stream
HistoricTelegramMessages HistoricRequest TelegramUserMessage stream

Requesting historic tweets

const client = new proto.HistoricData(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
client.HistoricTweets({ from: { seconds: 1546300800}, to: { seconds: 1546300800}}, function(err, res) { console.log(res) })
historic_request_kwargs = {'from': from_time, 'to': to_time}
req = types_pb2.HistoricRequest(**historic_request_kwargs)
tweet_items = stub.HistoricTweets(req)
historicStub := types.NewHistoricDataClient(conn)
historicRequest := &types.HistoricRequest{From: &timestamp.Timestamp{Seconds: 1546300800}, To: &timestamp.Timestamp{Seconds: 1546300800}}
response, err := historicStub.HistoricTweets(context.Background(), historicRequest)

Requesting historic articles

const client = new proto.HistoricData(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
client.HistoricArticles({ from: { seconds: 1546300800}, to: { seconds: 1546300800}}, function(err, res) { console.log(res) })
historic_request_kwargs = {'from': from_time, 'to': to_time}
req = types_pb2.HistoricRequest(**historic_request_kwargs)
article_items = stub.HistoricArticles(req)
historicStub := types.NewHistoricDataClient(conn)
historicRequest := &types.HistoricRequest{From: &timestamp.Timestamp{Seconds: 1546300800}, To: &timestamp.Timestamp{Seconds: 1546300800}}
response, err := historicStub.HistoricArticles(context.Background(), historicRequest)

Requesting historic reddit posts

const client = new proto.HistoricData(
  SERVER,
  grpc.credentials.createSsl(fs.readFileSync('./cert.pem'))
);
client.HistoricRedditPosts({ from: { seconds: 1546300800}, to: { seconds: 1546300800}}, function(err, res) { console.log(res) })
historic_request_kwargs = {'from': from_time, 'to': to_time}
req = types_pb2.HistoricRequest(**historic_request_kwargs)
reddit_items = stub.HistoricRedditPosts(req)
historicStub := types.NewHistoricDataClient(conn)
historicRequest := &types.HistoricRequest{From: &timestamp.Timestamp{Seconds: 1546300800}, To: &timestamp.Timestamp{Seconds: 1546300800}}
response, err := historicStub.HistoricRedditPosts(context.Background(), historicRequest)

Service for requesting historic data. Each request requires (at least) a specification of the time window.

HistoricRequest

Request for entries.

Field Type Label Description
from google.protobuf.Timestamp unix timestamp for start - included in results (greater or equal)
to google.protobuf.Timestamp unix timestamp for end - excluded from results (less than)
filter AssetsFilter
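Since `from` is inclusive and `to` exclusive, a one-day request covers the half-open interval [midnight, next midnight). A stdlib-only sketch computing the `seconds` values for such a window:

```python
from datetime import datetime, timedelta, timezone

# A [from, to) window covering one UTC day; `from` is inclusive (>=),
# `to` is exclusive (<).
start = datetime(2019, 6, 24, tzinfo=timezone.utc)
end = start + timedelta(days=1)

# These values would populate the `seconds` fields of HistoricRequest.from/.to.
from_seconds = int(start.timestamp())
to_seconds = int(end.timestamp())
print(from_seconds, to_seconds)
```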

Additional

Common objects

Asset

Represents one asset.

Field Type Label Description
symbol string symbol

AssetItems

Field Type Label Description
assets Asset repeated

AssetsFilter

Field Type Label Description
assets string repeated name of the asset, e.g. BTC
all_assets bool

BaseModel

Base model for messages and news; contains basic data like title, content, source, publication date, etc.

Field Type Label Description
id string unique identifier with schema
title string title of article
content string full content stripped of unnecessary characters (JS, HTML tags, ...)
crawler string
pub_date google.protobuf.Timestamp timestamp representing the datetime, when the article has been published
created google.protobuf.Timestamp timestamp representing acquisition datetime
source string url of article
excerpt string summary provided by the domain
videos string repeated list of video sources
images string repeated list of image sources
links string repeated list of off-page hyperlinks
author string author of article
lang string identified language
weight double importance of the article's creator
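Timestamp fields such as `pub_date` and `created` arrive as google.protobuf.Timestamp messages whose `seconds` field is a Unix timestamp. Converting such a value to a timezone-aware datetime with the standard library only (the literal below is just the example timestamp used elsewhere in this documentation):

```python
from datetime import datetime, timezone

# The `seconds` field of a google.protobuf.Timestamp is seconds since the
# Unix epoch; turn it into an aware UTC datetime.
pub_date_seconds = 1561400800
published = datetime.fromtimestamp(pub_date_seconds, tz=timezone.utc)
print(published.isoformat())
```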

Comment

Reddit comment

Field Type Label Description
ID string
name string
permalink string
createdUTC uint64
deleted bool
ups int32
downs int32
likes bool
body string
subreddit string
replies Comment repeated

CommentCountTimeSnapshot

Stores a time/count snapshot of a Reddit post's comment count. This is mainly for measuring the number of comments added during a period of time.

Field Type Label Description
time google.protobuf.Timestamp
comment_count int32
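The number of comments added between two snapshots is simply the difference of consecutive `comment_count` values. A sketch with stand-in objects rather than real protobuf messages (`time` shown as epoch seconds):

```python
from types import SimpleNamespace

# Stand-ins for CommentCountTimeSnapshot entries.
snapshots = [
    SimpleNamespace(time=1561400800, comment_count=10),
    SimpleNamespace(time=1561404400, comment_count=25),
    SimpleNamespace(time=1561408000, comment_count=31),
]

# Comments added between consecutive snapshots:
deltas = [b.comment_count - a.comment_count
          for a, b in zip(snapshots, snapshots[1:])]
print(deltas)  # [15, 6]
```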

PublicModel

Field Type Label Description
id string unique identifier with schema
title string title of article
content string full content stripped of unnecessary characters (JS, HTML tags, ...)
pub_date google.protobuf.Timestamp timestamp representing the datetime, when the article has been published
source string url of article
excerpt string summary provided by the domain
videos string repeated list of video sources
images string repeated list of image sources
links string repeated list of off-page hyperlinks
domain string
created google.protobuf.Timestamp timestamp representing acquisition datetime

UserMessage

Basic model for media where the messages are written by regular users.

Field Type Label Description
base BaseModel
sentiment SentimentModel
named_entities NamedEntitiesModel
user string nickname of user
message string text of message

SentimentModel

Groups data referring to the sentiment of a message.

Field Type Label Description
sentiment double analyzed sentiment <-10, 10>
market_impact double analyzed impact in the respective area
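A simple client-side bucketing of the sentiment value; the zero threshold here is an assumption for illustration, not part of the API:

```python
# sentiment is a double in the range <-10, 10>; bucket it by sign
# (the zero cut-off is an illustrative assumption).
def classify(sentiment: float) -> str:
    if sentiment > 0:
        return "positive"
    if sentiment < 0:
        return "negative"
    return "neutral"

print(classify(4.2), classify(-7.0), classify(0.0))
```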

NamedEntitiesModel

Groups all types of named entities we support.

Field Type Label Description
symbols string repeated list of crypto assets
assets NamedEntityOccurrence repeated recognized cryptocurrencies
persons NamedEntityOccurrence repeated recognized persons
companies NamedEntityOccurrence repeated recognized companies
organizations NamedEntityOccurrence repeated recognized organizations
locations NamedEntityOccurrence repeated recognized locations
exchanges NamedEntityOccurrence repeated recognized exchanges
misc NamedEntityOccurrence repeated recognized misc objects
tags string repeated list of assigned tags
asset_mentions NamedEntitiesModel.AssetMentionsEntry repeated mapped asset to its mention count
source_text string cleaned text used for NER

NamedEntity

Types of named entities

Name Number Description
ASSET_ENTITY 0
PERSON_ENTITY 1
LOCATION_ENTITY 2
COMPANY_ENTITY 3
EXCHANGE_ENTITY 4
MISC_ENTITY 5
ORGANIZATION_ENTITY 6

NamedEntitiesModel.AssetMentionsEntry

Field Type Label Description
key string asset
value int32 count
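In generated Python code a protobuf `map<string, int32>` behaves like a dict, so `asset_mentions` can be iterated directly; a plain dict stands in for it here:

```python
# Stand-in for the NamedEntitiesModel.asset_mentions map (asset -> count).
asset_mentions = {"BTC": 3, "ETH": 1}

# Rank assets by how often they were mentioned, most-mentioned first.
ranked = sorted(asset_mentions.items(), key=lambda kv: kv[1], reverse=True)
print(ranked)  # [('BTC', 3), ('ETH', 1)]
```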

NamedEntityOccurrence

Occurrence of a named entity. Contains position, matched text, and category.

Field Type Label Description
label NamedEntity Represents NamedEntity element
start uint32 Start position of occurrence
end uint32 End position of occurrence
text string Matched text
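The `start`/`end` positions index into the analyzed text; assuming they are half-open like Python slices (an assumption for this sketch), an occurrence can be checked against `source_text` as follows, with stand-in objects instead of real messages:

```python
from types import SimpleNamespace

# Stand-in for a NamedEntityOccurrence on a sample source_text.
source_text = "Vitalik Buterin discussed ETH scaling"
occurrence = SimpleNamespace(label="PERSON_ENTITY", start=0, end=15,
                             text="Vitalik Buterin")

# start/end are assumed half-open, as in Python slices.
matched = source_text[occurrence.start:occurrence.end]
print(occurrence.label, matched)
```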

RedditPostModel

Stores useful data from original reddit post

Field Type Label Description
ID string
createdUTC uint64
title string
URL string
author string
self_text string
name string
permalink string
deleted bool
ups int32
downs int32
likes bool
num_comments int32
score int32
replies Comment repeated list of comments
domain string
SubredditID string unique ID of subreddit
hidden bool
locked bool
thumbnail string
gilded int32
distinguished string
stickied bool
is_reddit_media_domain bool
comment_count_snapshot CommentCountTimeSnapshot repeated time/count snapshot of Reddit post comment count
hot_rate double Reddit post actual hot rate in queue
subreddit string Reference to parent subreddit
is_self bool
nsfw bool

ExtendedTweet

Stores some useful data from the original tweet

Field Type Label Description
favourite_count int32
filter_level string
id_str string
in_reply_to_screen_name string
in_reply_to_status_id_str string
in_reply_to_user_id_str string
is_quote_status string
lang string
possibly_sensitive bool
quote_count int32
reply_count int32
retweet_count int32
user_mentions string repeated
source string
author_created_at string
author_default_profile bool
author_default_profile_image bool
author_followers_count int32
author_friends_count int32
author_id_str string
author_lang string
author_location string
author_name string
author_screen_name string
author_profile_image_url_http string
author_statuses_count int32
categories string repeated
truncated bool
full_text string

Scalar Value Types

.proto Type Notes C++ Type Java Type Python Type
double double double float
float float float float
int32 Uses variable-length encoding. Inefficient for encoding negative numbers – if your field is likely to have negative values, use sint32 instead. int32 int int
int64 Uses variable-length encoding. Inefficient for encoding negative numbers – if your field is likely to have negative values, use sint64 instead. int64 long int/long
uint32 Uses variable-length encoding. uint32 int int/long
uint64 Uses variable-length encoding. uint64 long int/long
sint32 Uses variable-length encoding. Signed int value. These more efficiently encode negative numbers than regular int32s. int32 int int
sint64 Uses variable-length encoding. Signed int value. These more efficiently encode negative numbers than regular int64s. int64 long int/long
fixed32 Always four bytes. More efficient than uint32 if values are often greater than 2^28. uint32 int int
fixed64 Always eight bytes. More efficient than uint64 if values are often greater than 2^56. uint64 long int/long
sfixed32 Always four bytes. int32 int int
sfixed64 Always eight bytes. int64 long int/long
bool bool boolean boolean
string A string must always contain UTF-8 encoded or 7-bit ASCII text. string String str/unicode
bytes May contain any arbitrary sequence of bytes. string ByteString str

google.protobuf types

The following types are so-called well-known types provided by Google. We list them here for reference.

Field Type Label Description
Timestamp google.protobuf.Timestamp universal timestamp type
Empty google.protobuf.Empty empty message
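A Timestamp stores the time since the Unix epoch as separate `seconds` and `nanos` fields; splitting a float epoch value the same way (stdlib-only sketch, no protobuf dependency):

```python
# google.protobuf.Timestamp keeps seconds and nanoseconds separately.
# Split a float epoch time into those two components.
now = 1561400800.25
seconds = int(now)
nanos = int(round((now - seconds) * 1e9))
print(seconds, nanos)  # 1561400800 250000000
```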