My News 4.8.2024
"Good Afternoon Class"
"Good Afternoon Ms. Moreau"
"Well, it has been a while since I have been Online with My Classes"
"I Have been extremely busy learning to run a large Ranch in Wyoming"

"And I am quite aware of Your behavior with Your Substitute Teacher"
"So, before I dismiss Class Today for You to enjoy the Total Solar Eclipse"
"And also Suspend Today's Detention"
"Why I don't know"
"My Topic Today is Going to be a Doozy"
"Ms. Moreau"
"Yes Rod, of course"
"How do you organize a Solar Eclipse party?"
"How Rod?"
"You Planet"
In My Head
"Fuck, I have only been gone 1 Month, And I already need another Month off"
"Hilarious Rod"
"Ms. Moreau"
"Yes Willy"
"What did the Sun bring to the Solar Eclipse party?"
"What"
"A light snack!"
"Willy, You are a total !@#$%"
"Ms. Moreau"
"Yes Dick"
"How does the Moon cut its hair?"
"How"
"Eclipse it!"
"Dick, You are as Stupid as Willy !!"
"Okay class, enough nonsense"
"Let Me start My lesson or We will be here Till the next Total Eclipse"
In My head
"This is God's Penalty for all of My transgressions, I assume"

"My topic today deals with Biotechnology, specifically with Toxins"
"It is the use of Ensemble Machine Learning of Gradient Boosting, XGBoost, LightGBM, CatBoost, and Attention-Based CNN-LSTM for Harmful Algal Blooms Forecasting"
The Entire Class Groans the loudest I have ever heard
"Ms. Moreau"
"Yes Peter"
"How do You come up with the most boring Shit possible?"
"I am a Sadist, that's how"
"So, to begin"
"Harmful Algal Blooms, HABs, are a serious threat to ecosystems and human health"
"The accurate prediction of HABs is crucial for their proactive preparation and management"
"While mechanism-based numerical modeling, such as the Environmental Fluid Dynamics Code, EFDC"
"Has been widely used in the past, the recent development of machine learning technology with data-based processing capabilities has opened up new possibilities for HABs prediction"
"There has been development and evaluation of two types of machine learning-based models for HABs prediction"
"Gradient Boosting models, XGBoost, LightGBM, CatBoost, and attention-based CNN-LSTM models"
"CNN-LSTM architecture involves using Convolutional Neural Network layers for feature extraction on input data, coupled with LSTMs, Long Short-Term Memory, to exploit Spatio-Temporal dependencies"
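"Here is a rough Python sketch of such a CNN-LSTM with attention, written with Keras; it is only an illustration of the idea, the layer sizes and the 14-step, 8-variable input are My own made-up numbers, not the actual study's architecture"

from tensorflow.keras import layers, models

# Illustrative only: a hypothetical input of 14 time steps with 8 weather/water-quality variables
n_steps, n_features = 14, 8

inputs = layers.Input(shape=(n_steps, n_features))
# Conv1D layers extract local features from the multivariate time series
x = layers.Conv1D(32, kernel_size=3, padding="same", activation="relu")(inputs)
x = layers.Conv1D(32, kernel_size=3, padding="same", activation="relu")(x)
# An LSTM models the temporal dependencies across the time steps
x = layers.LSTM(64, return_sequences=True)(x)
# Simple attention: score each time step, softmax, then take the weighted sum of the LSTM outputs
scores = layers.Dense(1)(x)                 # shape (batch, steps, 1)
weights = layers.Softmax(axis=1)(scores)    # attention weights over the time steps
context = layers.Dot(axes=1)([weights, x])  # weighted sum, shape (batch, 1, 64)
context = layers.Flatten()(context)
# Single regression output, e.g. a next-day cell count or chlorophyll-a value
outputs = layers.Dense(1)(context)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")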
"Ms. Moreau"
"Yes Wang"
"Like the Temporal Anomalies in Star Trek"
"Nope, More like the Anomalies in Your Brain"
Class Laughs
"Carrying on"
"Bayesian optimization techniques were used for hyperparameter tuning, and bagging and stacking ensemble techniques were applied to obtain the final prediction results"
"Ms. Moreau"
"Yes Mr. Pecker"
"Bagging and Stacking, HaHaHa !!"
"I am sure You are familiar with That, Mr. Pecker !!"
Class Snickers
"To continue"
"The final prediction result was derived by applying the optimal hyperparameters and the bagging and stacking ensemble techniques"
"And the applicability of prediction to HABs was evaluated"
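"To make bagging and stacking concrete before Mr. Pecker starts giggling again, here is a minimal Python sketch; it uses scikit-learn's GradientBoostingRegressor on made-up data as a stand-in for the actual XGBoost, LightGBM, and CatBoost models, and it skips the Bayesian hyperparameter tuning entirely"

import numpy as np
from sklearn.ensemble import BaggingRegressor, StackingRegressor, GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Made-up data standing in for daily weather/water-quality features and an algal target
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=500)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Bagging: train the same learner on bootstrap resamples and average the predictions
# (scikit-learn 1.2+ uses estimator=; older versions use base_estimator=)
bagged = BaggingRegressor(estimator=GradientBoostingRegressor(random_state=0),
                          n_estimators=10, random_state=0).fit(X_tr, y_tr)

# Stacking: combine several base learners through a meta-learner fit on out-of-fold predictions
stacked = StackingRegressor(
    estimators=[("gb_shallow", GradientBoostingRegressor(max_depth=2, random_state=0)),
                ("gb_deep", GradientBoostingRegressor(max_depth=4, random_state=1))],
    final_estimator=Ridge(),
).fit(X_tr, y_tr)

for name, model in [("bagging", bagged), ("stacking", stacked)]:
    print(name, mean_squared_error(y_te, model.predict(X_te)))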
"Ms. Moreau"
"Yes Wang"
"What the Fuck"
"Well, I told You it was going to be a Doozy Today"
"Sure the Fuck it is !!"

"Ms. Moreau"
"Yes Spurt"
"What do you call a Blonde who dyes Her hair brown?"
"What?"
"Artificial Intelligence"
"Not too bad Spurt"
"Ms. Moreau"
"Yes Mr. Ploughman"
"What is an AI's favorite music?"
"Techno?"
"Algorhythms"
"Okay You Nincompoops"
"Let Me Continue My Lesson"
"Ms. Moreau"
"Yes Dick"
"Do You have to?"
"Yep"
"I said I was Sadistic"

"When predicting HABs with an ensemble technique"
"It is judged that the overall prediction performance can be improved by complementing the advantages of each model and averaging out errors such as the overfitting of individual models"
"I will highlight the potential of machine learning-based models for HABs prediction and emphasize the need to incorporate the latest technology into this important field"
"Various artificial environmental changes caused by continuous human activities, such as the Four Major Rivers Restoration Project"
"Of the Han River, Nakdong River, Geum River and Yeongsan River in South Korea"
"And also Global climate change"
"Are changing the aquatic environment and increasing the frequency of harmful algal blooms, HABs"
"Recently, in the Republic of Korea, the problem of water source management has been raised due to the occurrence of HABs in the water source section every Summer"
"And much damage, such as the death of aquatic organisms, is occurring"
"As a result, the need to preemptively predict and respond to HABs is emerging"
"Economic losses from HABs over the past 30 years have been estimated at USD 121 Million"
"The occurrence, duration, and frequency of HABs are increasing, posing a serious threat to aquatic ecosystems"
"The National Institute of Environmental Research, NIER"
"Integrated the water quality forecasting system and the algae warning system in 2020 as a system for managing HABs, and provides HABs forecast information to HABs management institutions and the general public"
"So that they can be managed preemptively through HABs forecasting"
"It is very important to improve the accuracy of HABs prediction by upgrading the HABs prediction technology"
"Various studies are being conducted to predict HABs as a method for quickly preparing a policy management plan before or when HABs are expected to occur"
"Previous studies have focused on improving HABs monitoring technology and raising awareness"
"And mechanism-based numerical modeling such as the Environmental Fluid Dynamics Code, EFDC, has been considered as an alternative for understanding and mitigating the effects of HABs"
"Recently, machine learning technology with large data processing capability has been attracting attention"
"And it is used in various fields such as voice recognition, image analysis, and biological mechanisms"
"Ms. Moreau"
"Yes Mr. Poke"
"Like Your Biological Mechanisms"
"Mr. Poke, !@#$ off !!"

"Among various time-series machine learning algorithms, Gradient Boosting and deep learning technologies are being advanced and applied to various topics"
"Artificial intelligence (AI) methods make significant contributions to the control of a system"
"To determining the decisions to be made about the system and future strategies, and to increasing efficiency"
"Ms. Moreau"
"Yes Wang"
"Are You increasingly efficient?"
""Always"
"Now"
"Gradient Boosting is generally known to have a higher prediction performance than random forest"
"Since an ensemble model is constructed using multiple decision trees, it shows high prediction performance"
"And since each decision tree learns by predicting the residual error of the previous decision tree"
"It has an effect of preventing overfitting"
"Representatively, there are eXtreme Gradient Boosting, XGBoost"
"Light Gradient Boosting Machine, LightGBM, and Categorical Boosting, CatBoost"
"XGBoost, LightGBM, and CatBoost are all machine learning libraries based on the Gradient Boosting algorithm"
"XGBoost was developed in 2014 and gained popularity as it performed well on large datasets and won many data science competitions"
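"For anyone who actually wants to try these libraries, here is a short Python sketch fitting all three on made-up data; it assumes the xgboost, lightgbm, and catboost packages are installed, and the hyperparameters are just illustrative, not the study's tuned values"

import numpy as np
from xgboost import XGBRegressor
from lightgbm import LGBMRegressor
from catboost import CatBoostRegressor

# Made-up data standing in for weather/water-quality features and an algal target
rng = np.random.default_rng(42)
X = rng.normal(size=(400, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(scale=0.2, size=400)

# Each library fits trees sequentially on the residuals of the previous trees
models = {
    "XGBoost": XGBRegressor(n_estimators=200, learning_rate=0.05, max_depth=4),
    "LightGBM": LGBMRegressor(n_estimators=200, learning_rate=0.05, max_depth=4),
    "CatBoost": CatBoostRegressor(iterations=200, learning_rate=0.05, depth=4, verbose=0),
}
for name, model in models.items():
    model.fit(X, y)
    mse = float(np.mean((model.predict(X) - y) ** 2))
    print(name, round(mse, 4))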
"Ms. Moreau"
"Yes Mr. Pecker"
"What do you call it when a Hedge Fund Manager loses His job to a Watson-inspired AI built by IBM?"
"I'll bite"
"It doesn't matter"
"We'll all be laughing too hard to care"
"HaHaHa"
In My Head
"God Damn little !@#$%^&*()'s"

"So, before the bell rings"
"A bit more, and then I will continue this topic next Class"
Class groans even louder !!
"Onwards"
"Since then, it has been developed by adding various functions such as GPU learning and distributed learning through version updates"
"LightGBM, developed by Microsoft in 2017"
"Has faster speed and lower memory usage than XGBoost"
"And is designed to ensure high speed on large data while ensuring high accuracy even on small data samples"
"Ms. Moreau"
"Yes Mr. Johnson"
"Do You know what Microsoft is?"
"What"
"Rod's Weiner"
"Tell Me something new !!"
Class busts out laughing
"So"
"CatBoost was developed by Yandex in 2017 and has strengths in handling categorical variables"
"It is an optimized algorithm that automatically applies regularization to prevent overfitting and enables fast learning on both CPU and GPU"
"Ms. Moreau"
"Yes John"
"Don't You date Yahoos in Wyoming?"
In My head
"Christ !!"

"I'll try to continue"
"Research on deep learning technology began with the RNN, Recurrent Neural Network model"
"Which was structured to calculate the current output value by considering the previous input value"
"The LSTM, Long Short-Term Memory model"
"And the GRU, Gated Recurrent Unit, model"
"Have also been published"
"The GRU model, which has a simpler structure than LSTM"
"Is an improved model using a gate to update the state of a memory cell"
"The Seq2Seq, Sequence-to-Sequence model"
"Which uses two RNN models, an encoder and a decoder"
"Was introduced to solve the problem that the lengths of the input sequence and the output sequence are different"
"To overcome the limitations of the RNN model"
"Which uses all information in the input sequence equally"
"The attention mechanism was developed by Bahdanau et al"
"Ms. Moreau"
"Is He Badder Than You?"
"Most likely not"

"It is a method of extracting information by focusing only on the necessary parts of the input sequence when calculating the output value"
"The Transformer model, which further develops the attention mechanism into a multi-head attention form, was introduced by Vaswani et al"
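"Here is a tiny Python sketch of the attention idea, in the scaled dot-product form used by the Transformer; it is just the bare arithmetic on random numbers, nothing from the actual study"

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Score the query against every time step's key, softmax the scores,
    # then return the weighted sum of the values: "focus only on the necessary part"
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Hypothetical example: one query (a decoder state) attends over a 5-step encoded sequence
K = np.random.randn(5, 8)   # keys: the encoder states
V = K.copy()                # values: here simply the same states
Q = np.random.randn(1, 8)   # query: the current decoder state
context, attn = scaled_dot_product_attention(Q, K, V)
print(attn.round(3))        # the 5 attention weights sum to 1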
"The Temporal Convolutional Network, TCN model"
"Which combines a 1D-CNN, Convolutional Neural Network"
"With models such as RNN, LSTM, and GRU, was proposed by Oord et al"
"Multi-head attention-based Transformer models have also been applied to time-series data prediction"
"Ms. Moreau"
"Yes Spurt"
"Like in the Movie Transformers"
"Spurt, You are a total Idiot !!"
"So, in Conclusion for Today"
"Ms. Moreau"
"Yes Mr. Poke"
"Shoutout to Jesus, This Class is going to end"
"Mr. Poke, You are not the only one"

"So, Recent research studies on predicting HABs using Gradient Boosting and deep learning techniques"
"Have become increasingly prevalent, particularly in the context of time-series data analysis"
"HABs data, along with various weather and water-quality variables that impact HABs"
"Exhibit a time-series distribution"
"One study improved the performance of machine learning models for the early warning of HABs by using an adaptive synthetic sampling method"
"Another study utilizing the Gradient Boosting technique employed gradient-boosted regression trees to predict cyanotoxin levels"
"There is also ongoing research employing deep learning techniques, such as the LSTM method, which is particularly effective for time-series analysis"
"And has been widely used for predicting Algae"
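"And since somebody will ask what adaptive synthetic sampling means, here is a minimal Python sketch using the imbalanced-learn package's ADASYN on made-up bloom and no-bloom labels; the numbers are invented, not from any of those studies"

import numpy as np
from imblearn.over_sampling import ADASYN

# Made-up early-warning data: bloom days (label 1) are much rarer than non-bloom days (label 0)
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0.0, 1.0, size=(450, 5)),   # non-bloom days
               rng.normal(2.0, 1.0, size=(50, 5))])   # bloom days
y = np.array([0] * 450 + [1] * 50)

# ADASYN synthesizes extra minority-class samples, focusing on harder-to-learn regions,
# so a bloom / no-bloom classifier can be trained on more balanced data
X_res, y_res = ADASYN(random_state=7).fit_resample(X, y)
print(np.bincount(y), "->", np.bincount(y_res))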
BRNNG, BRNNG
"Well, that is the end of Class for Today"
"Tomorrow, the Application and Study of Toxins in the Waters of South Korea"
Even more groaning
"But first, My AI jokes before dismissal"
"Two AIs are talking"
"One says"
"Do you think we will never be able to compete with humans?"
"The other replies"
"Don't worry, they'll be too busy arguing about which programming language is superior"
"And 1 more"
"Scientists predict human-level artificial intelligence by 2030"
"Maybe sooner if the bar keeps dropping"
In My head
"Like these !@#$%^& Loser Students !!"
"Class Dismissed"
"Detention resumes Tomorrow"
"Though much to Your chagrin, I will not be there, only watching You remotely"
"So, Enjoy the Solar Eclipse"
Class quickly closes Their Laptops
In My Head
"Thank God !!"
"And I am finally getting back at those little !@#$%^&*()'s !!"
"Now to get totally Eclipsed Myself"
"Where's that Bottle?"