Ben Chuanlong Du's Blog

It is never too late to learn.

Configure Log4J for Spark

Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!

Show Error Messages Only

When you run Spark or PySpark in a Jupyter/Lab notebook, it is recommended that you show ERROR messages only. Otherwise, too much logging information might pollute your notebook. You can set the log level of Spark to ERROR using the following line of code.
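A minimal sketch of the idea, assuming `spark` refers to an existing SparkSession (the conventional default name in a PySpark notebook session):

```python
# Assuming `spark` is an existing SparkSession, as in a typical
# PySpark/Jupyter notebook session.
# Suppress everything below ERROR in Spark's log output.
spark.sparkContext.setLogLevel("ERROR")
```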

Ensure Capturing Log of Applications

Logging is critical for debugging applications. For production applications, it is best to send log information into a file instead of the standard output so that the log information is persisted …
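As an illustration, a `log4j.properties` fragment along these lines (for the log4j 1.x configuration format used by older Spark versions; newer Spark releases use log4j2) directs log output to a rolling file instead of the console. The file path and size limits below are placeholder assumptions:

```properties
# Route the root logger to a rolling file appender instead of the console.
log4j.rootCategory=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
# Placeholder path; adjust for your environment.
log4j.appender.file.File=/tmp/spark-app.log
log4j.appender.file.MaxFileSize=50MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```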

General Tips on Logging

  1. Many logging libraries support sending logs as emails or as email attachments. This is a poor man's way of subscribing to errors and warnings (if there is no engineering team to support parsing …
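As a sketch of the email idea using only the Python standard library, the snippet below attaches an `SMTPHandler` so that ERROR-level records are emailed. The host, addresses, and subject are placeholder assumptions; no mail is sent until an error is actually logged.

```python
import logging
import logging.handlers

logger = logging.getLogger("app")

# Placeholder SMTP server and addresses -- replace with real values.
smtp_handler = logging.handlers.SMTPHandler(
    mailhost="smtp.example.com",
    fromaddr="alerts@example.com",
    toaddrs=["oncall@example.com"],
    subject="Application error",
)
# Only ERROR and CRITICAL records trigger an email.
smtp_handler.setLevel(logging.ERROR)
logger.addHandler(smtp_handler)
```

Setting the handler's level (rather than the logger's) keeps lower-severity records flowing to any other handlers while reserving email for errors only.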

Python Logging Made Stupidly Simple With Loguru

The best logging package for Python!

  1. Note that the default logging level in loguru is DEBUG, and loguru does not allow changing the logging level of an existing logger object. You can refer to changing-the-level-of-an-existing-handler and Change level of default handler for ways to change the logging level in loguru.

    1. Remove the default handler (whose logging level is DEBUG) and add a new one with the desired logging level.
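The remove-and-re-add approach can be sketched as follows, using loguru's `remove`/`add` API (the INFO level here is just an example choice):

```python
import sys
from loguru import logger

# Remove the default handler, which logs at DEBUG level.
logger.remove()
# Add a new stderr handler with the desired level.
logger.add(sys.stderr, level="INFO")
```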