Android Kaki


glove-android: Using GloVe Word Embeddings for NLP in Android | by Shubham Panchal | April 2023

A glimpse of the glove-android demo app. The first and third screenshots (from L -> R) show a 'compare words' feature that calculates the cosine similarity between two words. The second screenshot shows embedding generation in action.

glove-android is an Android library that provides a clean interface for embeddings from GloVe, which are already quite popular in NLP applications. Embeddings can be used to measure the semantic similarity between two words, since words that are similar will have their embeddings (multidimensional vectors) closer together.

Currently, the only supported embeddings are the GloVe 50D vectors trained on the Wikipedia corpus. This story outlines how developers can add glove-android to their Android projects, along with its inner workings and limitations. Here is the GitHub repo ->

Word embeddings are dense vectors (lists) generated for each word contained in a huge corpus. These vectors are created such that the vectors of two words with high semantic similarity are positioned close together in the embedding space.

To train the GloVe model, a co-occurrence matrix is used, whose ij-th entry is 1 if the i-th and j-th words occur together in a sentence.

Illustration of word embeddings in the embedding space. The words 'king' and 'queen' are contextually related and therefore point in (almost) the same direction, giving a high semantic similarity. 'Ice' is an unrelated word and its vector lies away from the other two.

The GloVe model is trained in such a way that similar words, i.e. words with high co-occurrence, are positioned close to each other. We can calculate the cosine of the angle between the embeddings: a value close to 1 indicates semantically related words, while a value of -1 indicates a high degree of dissimilarity.
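As a toy illustration (not part of the library), cosine similarity between two embedding vectors can be computed like this; the 3D vectors below are made up for demonstration, while real GloVe vectors are 50D:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 3D embeddings for demonstration only
king  = [0.8, 0.6, 0.1]
queen = [0.7, 0.7, 0.2]
ice   = [-0.5, 0.1, 0.9]

print(cosine_similarity(king, queen))  # close to 1 -> similar words
print(cosine_similarity(king, ice))    # negative -> unrelated words
```

Identical vectors give exactly 1, so the closer the result is to 1, the more related the words are.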

Developers can use the library's AAR, found in the Releases section of the repository. Download the AAR from the latest release and place it in the app/libs directory of the application.

glove-android.aar placed in app/libs, which holds the application's local libraries.

Next, we need to tell Gradle about this AAR so that it is included in the build. In the module-level build.gradle file, specifically in the dependencies block, add:

dependencies {
    implementation files('libs/glove-android.aar')
}

Sync the Gradle files and build the project. You should now be ready to use glove-android in your project. If you face any problems with the setup, please open an issue on the repository.

Embeddings are loaded from a file included in the library's package, so there is no API call to fetch them. Embeddings are loaded from an H5 file, which takes some time due to its large size of ~40 MB. To load the embeddings into memory we use the GloVe.loadEmbeddings method, which is a suspend function and therefore needs a CoroutineScope to run.

The method takes a callback of type (GloVeEmbeddings) -> Unit that returns an object of the class GloVeEmbeddings through which developers can access the embeddings synchronously.

class MainActivity : ComponentActivity() {

    private var gloveEmbeddings : GloVe.GloVeEmbeddings? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            // Activity UI here

            // GloVe.loadEmbeddings is a suspend function,
            // so we need a coroutine scope to handle its execution
            // off the main thread
            CoroutineScope( Dispatchers.IO ).launch {
                GloVe.loadEmbeddings {
                    gloveEmbeddings = it
                }
            }
        }
    }
}

Next, we can use the gloveEmbeddings object to retrieve the embedding for any word:

val embedding1 = gloveEmbeddings!!.getEmbedding( "king" )
val embedding2 = gloveEmbeddings!!.getEmbedding( "queen" )
if( embedding1.isNotEmpty() && embedding2.isNotEmpty() ) {
    result = GloVe.compare( embedding1 , embedding2 ).toString()
}

If the embedding is not found, the getEmbedding method returns an empty float array, which is why we check embedding1.isNotEmpty().

GloVe.compare takes two embeddings as FloatArrays and returns their cosine similarity, expressed mathematically as follows,
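For two 50-dimensional embedding vectors a and b, the standard cosine-similarity formula is:

```latex
\cos(\theta) = \frac{\mathbf{a} \cdot \mathbf{b}}{\lVert \mathbf{a} \rVert \, \lVert \mathbf{b} \rVert}
             = \frac{\sum_{i=1}^{50} a_i b_i}{\sqrt{\sum_{i=1}^{50} a_i^2} \, \sqrt{\sum_{i=1}^{50} b_i^2}}
```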

After going through the official GloVe website, where the embeddings are available for download as text files, we noticed the huge size of those files. The embeddings used by glove-android, the 50D (smallest) vectors trained on the 2014 Wikipedia dataset containing 6 billion tokens, have a file size of 167 MB and would otherwise have to be added as-is to the application's assets. In addition to file compression, constant-time retrieval is also needed, as searching through 6 billion tokens would take a lot of time. To solve these problems, glove-android uses the following techniques:

  • Store the embeddings in H5 format as a multidimensional array
  • Reduce floating-point precision from 32-bit to 16-bit
  • Store the word-to-index mapping as a hash table for near-constant-time retrieval. Here 'index' refers to the embedding's position in the multidimensional array.

The H5 format is a highly efficient file format for storing multidimensional arrays. Additionally, the precision of the embeddings is reduced to float16, resulting in a much smaller file size. This may slightly affect quality, since precision is reduced.
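A rough back-of-the-envelope check shows why halving the precision matters; the 400,000-word vocabulary size assumed here is that of the glove.6B set:

```python
import struct

VOCAB_SIZE = 400_000   # assumed vocabulary size of the glove.6B set
DIMS = 50

# struct format 'f' is a 32-bit float, 'e' is a 16-bit (half) float
bytes_f32 = len(struct.pack('f', 0.5))   # 4 bytes
bytes_f16 = len(struct.pack('e', 0.5))   # 2 bytes

size_f32_mb = VOCAB_SIZE * DIMS * bytes_f32 / 1e6
size_f16_mb = VOCAB_SIZE * DIMS * bytes_f16 / 1e6
print(size_f32_mb, "MB ->", size_f16_mb, "MB")  # 80.0 MB -> 40.0 MB
```

The float16 figure lines up with the ~40 MB load size mentioned earlier.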

Embeddings are stored in H5 format, but how do we know that the embedding for a particular word is at a particular index? We need to maintain a word-to-index mapping, which is stored as a dict in Python. Given a word, which is the 'key', we look up the corresponding 'value' that represents the embedding's index in the 2D array stored in the H5 file. This approach provides efficient storage and near-constant-time retrieval.
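The lookup described above can be sketched in plain Python; the three-word vocabulary and 3D vectors below are hypothetical stand-ins for the real H5 array and pickled dict:

```python
# Hypothetical in-memory stand-ins for glove_vectors_50d.h5
# and glove_words_50d.pkl
embeddings = [
    [0.1, 0.2, 0.3],   # row 0
    [0.4, 0.5, 0.6],   # row 1
    [0.7, 0.8, 0.9],   # row 2
]
word_index = {"king": 0, "queen": 1, "ice": 2}  # hash-table lookup, ~O(1)

def get_embedding(word):
    # Mirror the library's behaviour: empty result for an unknown word
    idx = word_index.get(word)
    return embeddings[idx] if idx is not None else []

print(get_embedding("queen"))    # [0.4, 0.5, 0.6]
print(get_embedding("qwerty"))   # [] -> word not in vocabulary
```

The word is resolved to a row index through the hash table, and the embedding is then read directly from that row of the array.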

import h5py
import numpy as np
import pickle

# Parse the GloVe text file into a word -> index dict and a list of vectors
glove_file = open( "glove.6B.50d.txt" , "r" , encoding="utf-8" )
words = {}
embeddings = []
count = 0
for line in glove_file:
    parts = line.strip().split()
    word = parts[0]
    embedding = [ float( parts[i] ) for i in range( 1 , 51 ) ]
    words[ word ] = count
    embeddings.append( embedding )
    count += 1
print( "Words processed" , count )

# Store the vectors as a float16 2D array in an H5 file
embeddings = np.array( embeddings )
hf = h5py.File( "glove_vectors_50d.h5" , "w" )
hf.create_dataset( "glove_vectors" , data=embeddings.astype( 'float16' ) )
hf.close()

# Store the word -> index mapping as a pickled dict
with open( "glove_words_50d.pkl" , "wb" ) as file:
    pickle.dump( words , file )

glove-android is a small component that can add a great feature to Android apps. I hope you will try it out in your projects and share feedback on the Issues or Discussions page on GitHub. Thanks for reading, and have a nice day ahead!

Updated: April 30, 2023 — 6:20 pm
