The best logging package for Python!
Note that the default logging level in loguru is `DEBUG`, and the level of an existing handler cannot be changed after it is added. See changing-the-level-of-an-existing-handler and Change level of default handler for ways to change the logging level in loguru: remove the default handler (which logs at the `DEBUG` level) and add a new one with the desired logging level.
Tips on Fbs
Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!
https://build-system.fman.io/
https://github.com/mherrmann/fbs-tutorial
New Features in Spark 3
AQE (Adaptive Query Execution)
To enable AQE,
you have to set `spark.sql.adaptive.enabled`
to `true`,
either using `--conf spark.sql.adaptive.enabled=true`
with `spark-submit`
or using `spark.conf.set("spark.sql.adaptive.enabled", "true")` in Spark/PySpark code.
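The option can also be set when the SparkSession is constructed. A configuration sketch, assuming pyspark is installed (the app name is hypothetical):

```python
from pyspark.sql import SparkSession

# Enable Adaptive Query Execution at session-creation time.
spark = (
    SparkSession.builder
    .appName("aqe-demo")  # hypothetical app name
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)
```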
Pandas UDFs
Pandas UDFs are user-defined functions
that Spark executes using Apache Arrow
to transfer data to pandas,
which allows vectorized operations on batches of rows
instead of row-at-a-time processing.
A Pandas UDF is defined using `pandas_udf` (typically as a decorator).
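The heart of a Pandas UDF is an ordinary vectorized function on pandas Series. A sketch, shown as plain pandas so it runs standalone (in Spark you would additionally decorate it with `pandas_udf`, which assumes pyspark is installed):

```python
import pandas as pd

# In Spark you would decorate this function, e.g.:
#   from pyspark.sql.functions import pandas_udf
#   @pandas_udf("double")
def plus_one(s: pd.Series) -> pd.Series:
    # Operates on a whole Series at once -- no per-row Python overhead.
    return s + 1

print(plus_one(pd.Series([1.0, 2.0, 3.0])).tolist())  # [2.0, 3.0, 4.0]
```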
Query Pandas Data Frames Using SQL
Environment Variables in Shell
export
A child process forked from a parent process does not inherit the parent's shell variables by default. The `export` command marks a variable to be exported to newly forked child processes, so that every child process inherits all marked variables.
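The same behavior can be demonstrated from Python by driving a POSIX shell (assumes `sh` is available): a plain shell variable is invisible to a child process, while an exported one is inherited.

```python
import subprocess

# FOO is a plain shell variable; BAR is exported. Only BAR reaches the child shell.
out = subprocess.run(
    ["sh", "-c", "FOO=1; export BAR=2; sh -c 'echo \"${FOO:-unset} ${BAR:-unset}\"'"],
    capture_output=True, text=True,
).stdout.strip()
print(out)  # unset 2
```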
unset
Bash Programming
Environment Variables
export
unset
Tips and Traps
explainshell.com is a great place for learning shell.
Bash-it/bash-it is a great community driven Bash framework.
It is suggested that you avoid writing complicated Bash scripts. IPython is a much better alternative.
Do NOT use `;` to delimit paths passed to a shell command, because the shell interprets an unquoted `;` as a command separator and will split the command in two. Use `:` to separate entries in `PATH`-like variables instead.