How to use Log4j logging in a Flink cluster?

I have a program in which Log4j is used to log some information. When I test it in the IDE (IntelliJ), the log file is generated successfully. But when I submit the job to a Flink standalone cluster, I cannot find any corresponding log file on any worker node.
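
For reference, a minimal sketch of the kind of logging the question describes, using the SLF4J API that Flink ships with (which is backed by Log4j at runtime); the class name WordCountJob is only a placeholder:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class WordCountJob {
    // Logger obtained through SLF4J; in a Flink distribution this is routed to Log4j
    private static final Logger LOG = LoggerFactory.getLogger(WordCountJob.class);

    public static void main(String[] args) throws Exception {
        LOG.info("Starting the job");
        // ... build and execute the Flink pipeline here ...
    }
}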
In standalone mode, Flink writes its log files to FLINK_DIR/log by default, where FLINK_DIR is Flink's home directory. You can control the logging behavior by editing FLINK_DIR/conf/log4j.properties. There, you can also change the log file location by setting log4j.appender.file.file=FILE_PATH.
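
As a rough sketch, the relevant part of a Log4j 1.x style FLINK_DIR/conf/log4j.properties could look like the following; the path /tmp/flink-job.log is only an example (Flink's own scripts normally set this to ${log.file}):

log4j.rootLogger=INFO, file

# File appender; change the path below to redirect where the log file is written
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.file=/tmp/flink-job.log
log4j.appender.file.append=false
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n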


