The best logging package for Python!
Note that the default logging level in loguru is DEBUG,
and it is not possible to change the logging level of an existing handler.
You can refer to changing-the-level-of-an-existing-handler and Change level of default handler for ways to change the logging level in loguru:

- Remove the default handler (whose logging level is DEBUG) and add a new one with the desired logging level.
Tips on fbs
Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!
https://build-system.fman.io/
https://github.com/mherrmann/fbs-tutorial
New Features in Spark 3
AQE (Adaptive Query Execution)
To enable AQE,
set spark.sql.adaptive.enabled to true,
either by passing --conf spark.sql.adaptive.enabled=true to spark-submit
or by calling `spark.conf.set("spark.sql.adaptive.enabled", "true")` in Spark/PySpark code.
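As a configuration sketch (assumes pyspark is installed and a cluster or local mode is available; the app name is illustrative):

```python
from pyspark.sql import SparkSession

# Enable AQE when building the session ...
spark = (
    SparkSession.builder
    .appName("aqe-demo")  # illustrative name
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)

# ... or toggle it at runtime, since it is a runtime SQL configuration.
spark.conf.set("spark.sql.adaptive.enabled", "true")
```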
Pandas UDFs
Pandas UDFs are user-defined functions
that Spark executes using Apache Arrow to transfer data
and pandas to work with the data,
which allows vectorized operations.
A Pandas UDF is defined using the pandas_udf decorator (or function).
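A minimal sketch following the multiply example from the Spark docs; the pure-pandas core below is what the UDF wraps, while the Spark wiring (commented out) assumes a running pyspark session with pyarrow installed:

```python
import pandas as pd

# The vectorized core: operates on whole pandas Series at once
# instead of one row at a time.
def multiply(a: pd.Series, b: pd.Series) -> pd.Series:
    return a * b

# In Spark, wrap it with pandas_udf:
#
#   from pyspark.sql.functions import pandas_udf
#
#   multiply_udf = pandas_udf(multiply, returnType="long")
#   df.select(multiply_udf("a", "b"))
```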
Query Pandas Data Frames Using SQL
Use a Class in the Definition of the Class in Python
Comments

- As long as the class name is not needed at definition time of the class, it is OK to use it.
- You cannot use the class in default values of its __init__ method, because default values are evaluated at class definition time, when the class name is not yet bound.
Shell in Docker
Configure the Shell for the RUN Command
https://docs.docker.com/engine/reference/builder/#shell
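For example, to make subsequent RUN instructions use Bash with pipefail (a common pattern from the Docker docs; the base image here is an arbitrary choice):

```dockerfile
FROM ubuntu:22.04

# Make subsequent RUN instructions use Bash instead of the default /bin/sh -c
SHELL ["/bin/bash", "-o", "pipefail", "-c"]

RUN echo "this now runs under bash"
```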
Configure the Default Shell for Terminals in Docker Containers
Just set the SHELL environment variable in …