...
...
#!/bin/bash
set -euo pipefail
# Init script to send host-level metrics from Databricks clusters to Prometheus/Grafana using the Alloy agent.
#
# 1. Add the init script to your cluster(s).
# 2. Add the required environment variables in your cluster configuration, e.g.:
#    PROMETHEUS_HOST=10.0.0.1:9090
#    DB_CLUSTER_ID=cluster-or-job-name
# 3. Add more labels if needed (job name, etc.).
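A minimal sketch of how the script can pick up those variables at startup, assuming the names above (the fallback values are placeholders, not real endpoints):

```shell
# Hedged sketch: read the cluster-config environment variables the script expects.
# The defaults below are placeholders only.
PROMETHEUS_HOST="${PROMETHEUS_HOST:-10.0.0.1:9090}"
DB_CLUSTER_ID="${DB_CLUSTER_ID:-cluster-or-job-name}"

echo "Shipping host metrics to ${PROMETHEUS_HOST} with label cluster=${DB_CLUSTER_ID}"
```

Using `${VAR:-default}` keeps the script usable on clusters where the variables were not set, while a production script might prefer `${VAR:?message}` to fail fast instead.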
On a scale from 1 to 10, how good are your data warehousing skills? Want to go above 7/10? Then this article is for you.
How good is your SQL? Want to get ready for a job interview ASAP? This blog post explains the most intricate data warehouse SQL techniques in detail. I will use the BigQuery Standard SQL dialect to jot down a few thoughts on this.
All of these diagrams are rendered dynamically by GitHub during HTML display: the images are generated from text inside GitHub-Flavored Markdown, and none are static images. Mermaid support was released on GitHub on 2022-02-14.
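As a quick illustration, a minimal Mermaid block of the kind GitHub renders inline (the node names and URL here are placeholders):

```mermaid
graph TD
    A[Start] --> B{Check}
    B -->|no| C[fa:fa-ban forbidden]
    click A "https://github.com" "Example tooltip"
```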
Pros & Cons:
Notes:
Font Awesome icons (e.g. `B-->C[fa:fa-ban forbidden]`), hyperlinks, and tooltips are supported by GitHub.

We will use the manifest way of installing Kubeflow: https://github.com/kubeflow/manifests
Create a Kind cluster with a Service Account Signing key for the API server, which Kubeflow needs (Istio requires it), like below:
cat <<EOF | kind create cluster --name=kubeflow --kubeconfig /home/alexpunnen/kindclusters/mycluster.yaml --config=-
kind: Cluster
apiVersion: kind.x-k8s.io/v1alpha4
nodes:
- role: control-plane
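  # Hypothetical continuation (an assumption, based on the kubeflow/manifests
  # instructions for kind): the API server flags Istio needs for projected
  # service-account tokens. Verify the paths and values against your
  # Kubernetes/kind version before use.
  kubeadmConfigPatches:
  - |
    kind: ClusterConfiguration
    apiServer:
      extraArgs:
        "service-account-issuer": "kubernetes.default.svc"
        "service-account-signing-key-file": "/etc/kubernetes/pki/sa.key"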
Percentage:
<img src="https://user-images.githubusercontent.com/16319829/81180309-2b51f000-8fee-11ea-8a78-ddfe8c3412a7.png" width="50%" height="50%">
Pixels:
<img src="https://user-images.githubusercontent.com/16319829/81180309-2b51f000-8fee-11ea-8a78-ddfe8c3412a7.png" width="150" height="280">
Name: ansible-collection-netbox_community-ansible_modules
Version: 0.1.6
Release: 1%{?dist}
Summary: Miscellaneous modules for Ansible Playbooks
# note: The netbox collection has inconsistent licensing! There's an MIT
# license file but the headers of all of the source code say GPLv3+. So it's
# probably all GPLv3+ (unless the author has permission to get all of the code
# relicensed)
License: MIT and GPLv3+
from neobase import NeoBase

N = NeoBase()
# Cache each key's city codes as a set for faster membership tests.
for key in N:
    N.set(key, city_codes=set(N.get(key, "city_code_list")))
for key in sorted(N):
    # Keep only airports ("A" in location_type); skip everything else.
    if "A" not in N.get(key, "location_type"):
        continue
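The filter in the second loop can be illustrated without NeoBase at all; a stdlib-only sketch with a stubbed dataset (the keys and field values are made-up samples, not real NeoBase data):

```python
# Stub standing in for NeoBase records: key -> fields (sample values only).
data = {
    "NCE": {"location_type": ["A", "C"]},  # both an airport and a city
    "PAR": {"location_type": ["C"]},       # a city only
}

# Same shape as the loop above: keep keys whose location_type contains "A".
airports = [key for key in sorted(data) if "A" in data[key]["location_type"]]
print(airports)
```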
See https://livy.apache.org/docs/latest/rest-api.html
Let app.json be the JSON payload that represents the application:
{
"file": "hdfs:///user/user/apps/hello-spark-0.0.1.jar",
"className": "acme.hello_spark.Grep",
"args": ["input/1.txt", "[Mm]agic"],
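Once app.json is saved locally, the batch can be submitted with a POST to Livy's /batches endpoint, per the REST API docs linked above; the host below is a placeholder for your Livy server:

```shell
# Hedged sketch: submit app.json as a Livy batch. LIVY_HOST is a placeholder.
LIVY_HOST="${LIVY_HOST:-http://localhost:8998}"
curl -s -X POST -H "Content-Type: application/json" \
     -d @app.json "${LIVY_HOST}/batches" \
  || echo "request failed (is a Livy server reachable at ${LIVY_HOST}?)"
```

Per the same docs, Livy replies with the new batch's id and state, which can then be polled with GET /batches/{batchId}.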