Durgesh (orionpax00)

@orionpax00
orionpax00 / cloudSettings
Last active April 9, 2020 18:36
Visual Studio Code Settings Sync Gist
{"lastUpload":"2020-04-09T18:36:14.879Z","extensionVersion":"v3.4.3"}
@orionpax00
orionpax00 / Description.md
Last active December 9, 2020 04:14
A simple Attention Mechanism for an LSTM-CNN Input model 🎯

Attention

The Attention mechanism in deep learning is based on the concept of directing a model's focus: the model pays greater attention to certain factors when processing the data. [1]
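The gist's own implementation is not shown on this page; as a minimal sketch of the idea, assuming tf.keras, an attention layer that learns to weight LSTM timesteps might look like this (class and variable names are illustrative):

import tensorflow as tf
from tensorflow.keras import layers

class SimpleAttention(layers.Layer):
    # Illustrative layer: scores each timestep, softmaxes the scores into
    # attention weights, and returns the weighted sum of hidden states.
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.score = layers.Dense(1)  # one relevance score per timestep

    def call(self, hidden_states):
        # hidden_states: (batch, timesteps, units), e.g. an LSTM's output
        scores = self.score(hidden_states)        # (batch, timesteps, 1)
        weights = tf.nn.softmax(scores, axis=1)   # focus distribution
        context = tf.reduce_sum(weights * hidden_states, axis=1)
        return context                            # (batch, units)

The resulting context vector can then be concatenated with the CNN branch's features before the final dense layers, one common way to combine the two inputs.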


from awake.visualise import Plotting

# Example call (remaining options elided in the original):
plot1 = Plotting()
plot1.plot(data={"x": [], "y": []}, type="line", interactivity="low", ...)

# General signature of Plotting.plot:
plot = Plotting()
plot.plot(data, type, style_info, interactivity, summary, summary_out, ...)
import glob
import pandas as pd
import xml.etree.ElementTree as ET

def xml_to_csv(path):
    # Gather every annotation XML under `path` into a single DataFrame.
    xml_list = []
    for xml_file in glob.glob(path + '/*.xml'):
        tree = ET.parse(xml_file)
        root = tree.getroot()
        # Assumes Pascal VOC-style files: one <object> element per label;
        # extend the tuple with bounding-box fields as needed.
        for member in root.findall('object'):
            xml_list.append((root.find('filename').text,
                             member.find('name').text))
    return pd.DataFrame(xml_list, columns=['filename', 'class'])
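A hypothetical invocation (the directory and output file names are illustrative, not from the original gist):

df = xml_to_csv('./annotations')      # directory holding the .xml files
df.to_csv('labels.csv', index=False)  # write the combined table to CSV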
import * as tf from '@tensorflow/tfjs';

async function trainModel(model, inputs, labels) {
  // Prepare the model for training.
  model.compile({
    optimizer: tf.train.adam(),
    loss: tf.losses.meanSquaredError,
    metrics: ['mse'],
  });

  const batchSize = 28;
  const epochs = 50;  // assumed value, tune for the task

  // Run the training loop.
  return await model.fit(inputs, labels, {batchSize, epochs});
}