%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% *Mogire Victor Maranga UNITS IMAGE + DETAILS 5(registrar,skul,dep<->examdean,list,financeVICTORCOMPLETESTATUS)-2(scripts,letter) *Shilla 2(scripts,letter) @@CHARITY -> 0721581667 MAISH @ STAMP -> 0721770531 March 100k-> Dan(Reg-Skul-Dep-[1,2,3,4 Lists and Grad List])(IfAccessibleOLDSYSTEM COZ OF DATE IN NEWSYSTEM COMPLETE_UNITS-Use Victor's OLDSYSTEM CRITERIA& COMPLETE_FINANCE-Use Victor's OldSystem CompletionRemarks& Shilla(Reg - Skul-Dep - 4 TScripts) & 5k-> Maish - CSChairStamp For ShillaChairLetter September 50k-> Dn Backend James Go Ahead For October4TScripts + ChairLetter October -> James Go Ahead4TScripts + ChairLetter
OR
[Victor's Resemblance] 4) James Go Ahead -> [4Tscripts & Completion Letter] 3) Dep ->Frontend_CompletedStatus & Backend_CompletedStatus 2) School ->Frontend_CompletedStatus & 1,2,3,4 List_CompletedStatus & Registered for YEAR4 ALL UNITS in [2020_DATE] OLD SYSTEM [Have CompletedStatus] & Finance_CompletedStatus[2020_DATE]
- Registrar ->[Admission+Frontend Units]_CompletedStatus & Grad List_CompletedStatus & Finance_CompletedStatus[2020_DATE]
Got the letter in 2016 I was a Self sponsored student Computer Networks Introduction to Programming and Algorithms
Network Design and Management Software Engineering
Project Management Advanced Database Management Systems
Professional Issues in Information Technology Electronic Commerce
April 2020 Project Presentation
John Computer Networks and Cybersecurity Patrick Java Enterprise Edition Victor Android application
Two Attachments
Master's 23rd April 2020 1st September 2020
2020 1935
2021 25th April 2017 4th September 2017
2022 - fam %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
![pipelines-logo](https://github.com/devopsengineerDan/engineer-dancun-personal-blog/assets/48592378/4eb0599c-7062-4e5a-84cf-b03055e87bf1)
![scope_and_context](https://github.com/devopsengineerDan/engineer-dancun-personal-blog/assets/48592378/2ae81780-080a-46d0-a876-f2915d6752c3)
![ml-pipeline-engineering](https://github.com/devopsengineerDan/engineer-dancun-personal-blog/assets/48592378/87d18387-2657-43e9-bf35-03864ac95b4e)
![ci_workflow](https://github.com/devopsengineerDan/engineer-dancun-personal-blog/assets/48592378/5c9e5a25-33ca-4afa-89e3-eb308dc562a5)
![github_pr](https://github.com/devopsengineerDan/engineer-dancun-personal-blog/assets/48592378/7710ef07-91d1-4e55-a9e1-91e76541f3ad)
![bodywork-cli](https://github.com/devopsengineerDan/engineer-dancun-personal-blog/assets/48592378/96dcd641-e202-457f-9cfe-b857196204c7)
![ml-pipeline](https://github.com/devopsengineerDan/engineer-dancun-personal-blog/assets/48592378/7f25fedd-c6bf-4d14-9cfe-fb1c1e1d8d2b)
![etl](https://github.com/devopsengineerDan/engineer-dancun-personal-blog/assets/48592378/5aecd34b-e722-4390-ad77-5ed10aa5f3a7)
===============================================================================================================================================
===> LIFE IS WHAT YOU MAKE OF IT AND [KNOW << AND >> GET IT]
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
Security
Vested interest WAR Wisdom/ Sound Reason/ Illuminated Light/ Accurate Consciousness/ A+ / Science Religion => Dogmatic Superstition
Incompetence DISEASE Love => Fear
Corruption POVERTY/ VIOLENCE Wisdom/Sound Action => Ignorance of mob/crowd
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
SECURITY
STABILITY
PERFORMANCE
=========================
(A+)Accurate and Conscious && 9, 6 and 3 Dimensional plus Be in Harmony with nature
--------------
Analysis(Simple, Accurate, Categorise) Emotional Intelligence => Ordained
Operations Social Awareness => Chosen
Implementation(Complete) Logical => Elected
--------------
Numbers[Make Huge Gain/Profit] + [Offer Minimal Reward plus Blessings] + [Operate within a BUDGET], (Reduce randomness, Reduce Loss) || (Be aware of threat/Killers, Opportunists, Destroyers)
=========================
SECURITY Limitless
CONSISTENCY Limitless
DISCIPLINE Limitless
HARDWORK Limitless
-------------
SECURITY Limit
Time within Bound
Way within Scope
Resources within Limit
--------------
SECURITY
Feel(Be Neutral and Have Awareness)
Energy
Vibration
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
---------------
1 2
Information KNOW(GOD + Wealth) Observation
Communication(Space<->Complete) Accurate <-> Feel <-> Behaviour <-> Pattern
PROTECT Life, Liberty, Equality and Property
9 SNAKE + LION => 33 [WE]
@[UP] BULL => WHY
LION + EAGLE => WHY
=> HOW
6 ADAPT
8 REVIEW
=> WHAT [MINE]
3 LION + SNAKE
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
🧨🧨🧨🧨🧨🧨🧨 LIFE CRUCIAL LESSONS 🧨🧨🧨🧨🧨🧨🧨🧨
AWARE ALIVE
INCREASE IN WEALTH A+ [Accurate, (Know your Position and other Parties Positions and Be Alive to Yourself and to all other Parties), Humble and Generous/Kind]
SEX DOPAMINE (CORP+CARE) TIME <=> FEEL
ENTERTAINMENT FEAR (PROSTITUTE) WAY <=> ENERGY
FOOD+WATER DETERIORATE (DRUNKARD) REALISTIC(OWN) <=> VIBRATION
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Portfolio Gain/ Brokerage
Salary
🧨🧨🧨 SHORT/LOSS/STEAL => Result to Destruction then DEATH
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
🧨🧨🧨🧨🧨🧨🧨🧨 BAD SITUATIONS 🧨🧨🧨🧨🧨🧨🧨🧨
Naive
Foolish / Laziness
Ignorance / Stupid
🧨🧨🧨 CON(Naive) - LIE(Foolishness) - THEFT(Ignorance)
🧨🧨🧨 Delay - Excuse - Failure
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Indecisive/ Stagnant => Careful Decision Making to be CORRECT and EFFECTIVE
Chaos => Intelligent Adapting to be BETTER
Rigid/ Brick Wall => Productive while GAINING
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀
🧨🧨🧨 CONSCIENTIOUS, CHARISMATIC, MATURE ======> Hold Strong, Move Fast
=========> WORK + GET REWARD => SOLUTION PROVIDER
=========> PURPOSE 🧨🧨🧨 DECISIVE
=========> GROWTH 🧨🧨🧨 ADAPTIVE
=========> LIVELIHOOD / SHELTER 🧨🧨🧨 PRODUCTIVE ACTION
🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀🌀
🧨🧨🧨 NO SPECULATION
🧨🧨🧨 NO RANDOMNESS 🧨🧨🧨 NO LOSING OF MONEY
Opportunist/Take Advantage Steal
=> Little Token => Little Token
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
---------------
1 DAVID 2 DANIEL
Security Security
Consistency Accurate(Simple, Prioritise, Levels - high to low)
Discipline Adaptive within Scope
Hard work(Realistic) Realistic(High performance and Continuity)
---------------
1 FATHER 2 POLITICS 3 POLITICS
Know Character Time => Within time bound
Connect the dots Emotions Way => Within scope
Competitive Logical Realistic => Within required resources
---------------
👉 Sheep<=>Goat*******************************
Humble(Arrogant)
Kind(Mean)
++++++++++++++++++++
Discipline(Not on Track)
Obedient(Disobedient)
++++++++++++++++++++
👉 LISTEN && BE IN HARMONY WITH NATURE*******************************
👉 [ACCEPT+AGREEMENT]=>TRANSACTION*******************************
---------------
🧨 SECURITY
🧨 RESPONSIBLE
🧨 PLAN
🧨 GROWTH
🧨 EXECUTE
---------------
👉 Daniel, [7/13/24 10:42 AM]
3 Britain_USA Goat Laws. China_Russia Sheep Laws . Agreement_Transaction
A+ ----- All
3 Market No Chase, Rush, Force(Try to bend Nail). Leader is BlackRock_Sun and Partial Judge, Class,Tribe/Family. THERE IS NO HEAVEN_(CREATE) HEAVEN AND THERE IS HELL.
***************************************************************************************************************************************************************************************
🌀
Increase in Wealth to achieve Objective
**
% Awareness to achieve Objective
% Actor to achieve Objective
% UP Increase Logical ------ [BEING, IMPROVE, HARMONY WITH NATURE]-Appearance(Objective), Emotions(Equilibrium-Move within Scope), DOWN Increase Logical(Entanglement)
***************************************************************************************************************************************************************************************
🌀
£ UNDERSTAND [System, Policy/Law, Contract, System]
£ CONMAN, LIAR, THIEF
£ UNDERSTAND SYSTEM ======= BE IN HARMONY WITH NATURE
£ TRANSFER FUNDS TO PROPERTY MANAGEMENT
***************************************************************************************************************************************************************************************
🌀
%Daniel [Consistency,Discipline, Hardwork], Humble_Kind
%Dad [Survive, Know to Detail, Connect the Dots, Competitive]
%DEATH - SUN *Harmony with nature [within scope and budget] ---§ MOON - DEATH *Add up in nature [within scope and budget]
***************************************************************************************************************************************************************************************
🌀
================> INTERACTION WITH ENVIRONMENT
Story-Telling => Dismiss
Entertaining => Condone
Educational => Exploit
Desire
Dream
Pain <===> Need
===> SURVIVE
===> AWARENESS
Simple => Obvious [Overstand && Know to Detail && Complete Duty]
Complex => Overengineering && EXPERIMENTATION [Adapt, Review, Innovate=Invent]
Complicated => Ignorance && PERSISTENT [Understand]
👉 Daniel, [7/14/24 11:53 AM]
EXTREME==VICE==PUNISHMENT ---> IAMNOT
👉 Daniel, [7/14/24 11:58 AM]
Images Warren Buffett, Bill Gates, Zuckerberg
CR7, Messi
New Zealand Rugby
👉 Daniel, [7/14/24 12:06 PM]
Net worth plan
Banks
Daniel
Family
🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨
🧨 TIME => PURPOSE
🧨 TALENT AND REVIEW => GROWTH => WAY
🧨 PRACTICE TO MAKE PERFECT => PHYSICAL NEED
*******************************************************************************************************
🧨 I -> INFORMATION => OBSERVE, FEEL, LISTEN, TASTE, 6TH SENSE
🧨 S -> SILENCE
🧨 S -> STRATEGIC COMMUNICATION
🧨 H -> HARMONY WITH NATURE
*******************************************************************************************************
👉 GRIT
🧨 SECURITY
🧨 [INNOVATE + INVENT]
🧨 PERFORMANCE
🧨 CREATE 👉 ISSH
🧨 IMPROVE 👉
🧨 MAINTAIN 👉 PERFORMANCE
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
==> SECURITY
==> GOD
==> CONSCIOUSNESS
==> BELIEF
==> 🧨🧨🧨👉👉👉 CONSTITUTION / AGREEMENT
ENERGY
PHYSICAL
------------------------------------------------------------------------------------------------------------------------------------------------------------------
==> MIND
==> SEE / OBSERVE
==> PSYCHOLOGICAL
==> IDEAS
PHYSICAL
------------------------------------------------------------------------------------------------------------------------------------------------------------------
===> SECURITY ===> SECURITY
DIEUS GOD == TIME == WISDOM/ NEUTRAL / APPLICATION OF KNOWLEDGE == TRUTH == HARMONIOUS
EQUILIBRIUM AND == WAY == BEAUTY == LOVE == INTEGRATED
VACUUM
LOGIC MY RIGHT == REALISTIC == RESISTANCE == JUSTICE == INDUSTRY
🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨
MANAGER [BLAZING TORCH, ORDAINED=>Conscientious, CHOSEN=>Charismatic, ELECTED=>Mature]
RESPONSIBLE BUSINESS SENSE[Finance,Marketing,Economics] <==> [BUSINESS TIME, BUSINESS ADAPT, BUSINESS PROFIT AND LOSS]
LAW
SILENCE**************
Self Mastery PSYCHOLOGY[MINE+ENVIRONMENT] FEEL <=> TIME [Who am I + What Position am I, Who is the other party + What Position is the other party]
Listen *ALL SENSES* ENERGY <=> WAY Where are we
Write + Communicate VIBRATION <=> REALISTIC What Message[SILENCE + ALLOWANCE FOR ENVIRONMENT + TO RESPOND] + How to Pass Message[ORDERLY] + COMPLETE Message
ADAPT
REVIEW
INNOVATE + INVENT
Performance
Quality IN ALL**************
Stability
🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
--------------------------
👉
INCREASE IN WEALTH + GOD[Accurate, Conscious on Both Parties, Generous]
1 Don't Discuss Income, If so Have it on the LOW.
2 Investor<=>Gain, [Devops Engineer and Scientist] ,Database Engineer.
3 Understand System, Policy, Contract.
*** SILENCE (ALL SENSES)
READ
===> WRITE (ACCURATE, ORDERLY, COMPLETE)
===> SPEAK (ACCURATE, ORDERLY, COMPLETE)
--------------------------
👉
GOVERNANCE=> Cybersecurity Systems, AI Systems
GOLD
BIG CURRENCIES => DOLLAR, POUND
MINERALS => OIL, HIGH VALUE MINERALS
STOCK=> Stake Holders
BUSINESS=> Big Insurances, Big Banks
PROPERTY=> Farming Projects, Academies, [Rentals] Real Estate Properties, [Sell] Real Estate Properties || Lands || Natural Resources
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
👉
1 ===> INCREASE IN WEALTH(High Performance_in_PRODUCTIVITY)
2 ===> LAYERING(High Performance_in_PRODUCTIVITY)
👉
SUPREME
CENTRAL ===> Central Bank
LARGE ===> MARKET MAKER ===> Institutional Investor
MEDIUM ===> MARKET MOVER/ MARKET SHAKER ===> Super Banks
SMALL ===> Retail Investor
👉
3 ===> KNOWN ASSETS(High Performance_in_PRODUCTIVITY)
**********************************************
===> SOURCE -> Already Existing Assets, Loan
===> ADD UP -> Orderly, Realistic
**********************************************
👉
WORK KNOWLEDGE + BUSINESS KNOWLEDGE
WISDOM <==> ACTION
GOOD LUCK
---------------------------
👉
FOREIGN INVESTORS
LOCAL INVESTORS
LOANS
---------------------------
👉
No Knowledge
Don't know what you're saying
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
👉 HEALTHY DIET
[GOD + WEALTH]
Silence + Listen
Speak
Breathe
Water
[Bones soup & Milk] + Vitamins + LESS Carbohydrates + MORE Proteins[Eggs & Beans]
=================================
👉 RELATIONSHIP WITH ENVIRONMENT
[GOD + WEALTH] => SPIRIT/ WILL => SECURITY
SILENCE + LISTEN
SPEAK
SEX[Validation] => PSYCHOLOGY => GREED
ENTERTAINMENT[Love] => ESTEEM => FEAR
FOOD[Gift/Reward] => BACKGROUND => HATE
====================
👉 BEHAVIOUR I HATE
SCARCITY/Moon ABUNDANCE/Sun
=> GOAL ORIENTED
CHAOS + INDECISIVENESS + IRRATIONAL => ***EXTREME*** DESIGN and DECISIVENESS + RATIONAL 97%[SIMPLE ** CATEGORIZE] && Avoid [Extreme Chaos and Extreme Paralysis plus Speed]
LAZY + STAGNANT => CONTINUOUS PROCESS IMPROVEMENT and GROWTH
MEAN => PRODUCTIVE and RESULT ORIENTED
===================================================================================================================================================================================================================================================
👉 SUMMARY
🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨
POOL
14 1 5
8 11 12 7
6 13 10 9
4 2 15
3 White
🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨
🌀 SECURITY(PERFECTION/COMPLETE) AND SILENCE
1 Cybersecurity🧨🧨🧨
2 Armory
3 Pool table* , Guitar* 🧨🧨 YOUTUBE + SONGSTERR 🧨🧨
4 Chess
🌀 ACCURATE AND PRIORITY(TIME BOUND)
1 Silence, Listening🧨🧨🧨
2 Reading, Writing
3 Speaking*
4 Golf
🌀 ADAPT IN (CONTEXTUAL SCOPE)
1 Forex🧨🧨🧨
2 Pool table* , Guitar* 🧨🧨 YOUTUBE + SONGSTERR 🧨🧨
3 Swimming
🌀 PERFORMANCE AND PERFECTION(COMPLETE)
1 Eat and Drink
2 Exercise and LAND and RENTAL PROPERTIES AND SERVICE WORK
3 Drive( AUTO* & MANUAL*)
👉 KINGSHIP
=> DAVID => KEVIN
=> DANIEL => MAULID
=> STATES => UK
=> HORUS hand gesture => ROYAL hand gesture
☀️ SUMMARY
✅
=> Natural Resources
=> Technology and Science
=> Gold and Money
✅
=>Military -> Harmonious
=>Complex (Scientific Research and Entertainment) -> Integrated
=>Industrial -> Industry
✅
Be Confident and Humble
Lead with empathy
Organized
Ops Clandestine
Charismatic
Kaizen
✅ ✅
2PHASES TIME
🌑 Moon =>Saturn, Fox, Thoth ☀️ Sun =>Lion, Eagle
[Death]
[Chaotic] RIGHT WAY
[Mistake/Lustful Punished]
[Selfish Violence] REALISTIC
☀️ 33 DEGREES
✅
UPGRADE UPGRADE
SECURITY SECURITY
Time[MONEY] Truth[FEEL]
Energy Imagination
Vibration[VIOLENCE] Reality[JUSTICE]
✅
UPGRADE UPGRADE
Rigid[SECURITY] SECURITY[DOMINATE]
Light Know[CONQUER]
Opinion Connect the dots
Rigid Competitive[WIN]
✅
UPGRADE UPGRADE
SECURITY SECURITY
SIMPLE(ACCURATE, CORRECT, EFFECTIVE) Consistency[PRIORITIZE,CONSCIENCE,SCHEDULE]
ADAPTABILITY & SCALABILITY Discipline
PERFORMANCE & RELIABILITY Hardwork
-----------------------------------------------------------------------------------------------------------
✅ [ART]
Dark
|
Red
Red
|
White
Blue
Red -> Violence
✅ [SCIENCE]
HTML && CSS
Python
Rust
Zig
C
✅ [ANIMAL]
Snake & Lion
Bull
Eagle
|
Lion
Bear
Lion & Snake
☀️ 32 DEGREES
🧨🧨🧨🧨🧨🧨🧨🧨🧨 JAHBULON
BUL -> Baal ON- Osiris JAH -> Harmonious, Integrated, Industry
🌑 (Saturn, Fox, Thoth)[USA] Mind/Nature MINE/ CAPITALIST ☀️ (Lion, Eagle)UK Heart/Life WE/ OURS/ SOCIALIST
SECURITY SECURITY -> Dieus
Cost(Nothing is free) Personal -> Equilibrium/ Balanced/ Neutral
Mistake(Punishment) Free & Forgiveness -> Vacuum
Justice(Selfish) Private -> Logos/ Logical
=> SCIENCE/PHILOSOPHY => RELIGION
Security(Death) Fear(Death)
Character(TIME) Dogma(TIME)
Science(Within Scope) Morality(Within Scope)
Imagination(Out of Scope) Disease(Out of Scope)
Logos(Poverty/Death) Ignorance(Poverty/Death)
🗝️ 🗝️
SECURITY KNOW
SOLUTION PROVIDER CONNECT THE DOTS
COMPETITIVE
✅ ✅ ✅
SECURITY SECURITY SECURITY
Brits Professional Consistency[PRIORITIZE,CONSCIENCE,SCHEDULE]
USA Follow-up Discipline
🧨🧨🧨(Imagination/Poverty/Death) Original Hardwork
1. (33 DEGREES) SENIOR SOFTWARE ENGINEER
2. (32 DEGREES) SENIOR SECURITY ENGINEER
3. COMPUTER ENGINEER
4. SOFTWARE ENGINEER
5. FRAMEWORKER
8. COMPUTER SCIENTIST 6. DEVELOPER
7. PROGRAMMER
9. ELECTRICAL & MECHANICAL ENGINEER
✅ [GENERAL]
UPGRADE
SECURITY
SIMPLE(ACCURATE, CORRECT, EFFECTIVE)
ADAPTABILITY & SCALABILITY
PERFORMANCE & RELIABILITY
✅ [SPECIFIC]
👉 WEB3 ENGINEER MAJOR
-> AI & Quantum
-> Distributed System
-----------------------------------------------------------------------
🧨🧨🧨BUNDLER ====> BUILD ENGINE
WebAssembly App (LLVM + Assembly)
Web App (Native) + Progressive Web App => WEBSITE
Mobile App (Native)
Desktop App (Native)
-----------------------------------------------------------------------
👉 ACHIEVING CROSSPLATFORM (PLATFORM AGNOSTIC)
1. LLVM + Assembly
2. KIVY(Language API Wrapper, Operating System and Kernel Driver API implementation)
👉 SOFTWARE-> INTERPRETED & COMPILED
👉 PROGRAMMING LANGUAGES
----------
HTML & HTMLX
CSS & SASS
🌀 PYSCRIPT(Python Implementation)
----------
PYTHON -> JSON
RUST(SAFE[Software Optimisation] & UNSAFE[Zig Implementation -> Hardware Optimisation]) -> JSON
----------
SQL & NOSQL
RestAPI & GraphQL
----------
C & ASSEMBLY
LLVM IR & ASSEMBLY
----------
TOML
BASH
POWERSHELL
----------
🌀 MICROPYTHON(Python Implementation)
POSIX
----------
👉 FRAMEWORKS
------------
WEB
Python -> Django, fastAPI, Flask
Rust -> Rocket, Actix
------------
MOBILE
Python -> Kivy
Rust -> Tauri
------------
DESKTOP
Python -> Tkinter[In-built/Standard] -> Kivy
Rust ->`Tauri
------------
👉 CLOUD SERVICES
Amazon(Large Applications), Google(Linux), Azure(Microsoft Windows)
👉 DEPLOYMENT
Heroku, Linode, NameCheap
===============================================================================================================================================
🧨 STUDYING BEST PRACTICE
----------------------------------------------------------
**** STRONGEST<=> HIGHEST <=> GREATEST
*1
HAVE THE CLEAR PICTURE OF A SUBJECT
HAVE AN OVERVIEW & BASIC UNDERSTANDING
HAVE A DETAILED UNDERSTANDING
*2
1ST STAGE 2ND STAGE
CONSUMPTION PERIOD DIGESTION PERIOD
P => Procedural ==> Practice E => Evidence ==> Store
A => Analogous ==> Critic + Prior Studying R => Reference ==> Rehearse
C => Conceptual ==> Mapping
----------------------------------------------------------
*3
SIMPLE => [ACCURATE & CATEGORIZE]
COMPLEX => [RESEARCH & ADAPTIVE]
COMPLICATED => [REPETITION]
*4
CREATE => HYPOTHESISE
EVALUATE => PRIORITISE
ANALYSE => COMPARISON
APPLY[SIMPLE, COMPLEX, DIFFICULT] => PROBLEM SOLVING
UNDERSTAND => EXPLAIN
REMEMBER[REPETITIVE] => REGURGITATE
----------------------------------------------------------
👉 FINANCIAL ENGINEERING GUIDE -> Backup.md README SECTION
-> https://www.youtube.com/watch?v=UewlVUZliV0
🧨 Computational Finance -> https://www.youtube.com/watch?v=IRMn6JQvU8A&list=PL6zzGYGhbWrPaI-op1UfNl0uDglxdkaOB
🧨 Computational Finance Q&A -> https://www.youtube.com/watch?v=brKYNyyI2Zs&list=PL6zzGYGhbWrPzu3MojPq24denhjJ4kFCI
🧨 Financial Engineering Course -> https://www.youtube.com/watch?v=oX7iyvZsAak&list=PL6zzGYGhbWrMpjEKDtnrHWyIj-oVLKCYD
- EDUREKA
- free code camp
- LINKS
MIKROTIK ROUTER CONFIGURATION GUIDE
👉 MIKROTIK ZERO TO HERO -> https://www.youtube.com/watch?v=Gf-iCjh2Bwg&list=PLlEVCBdM7ELM3S7Ew3HtbaX6ZUB_Pfihh
👉 MIKROTIK FUNDAMENTALS -> https://www.youtube.com/watch?v=76nK1LXyPMA&list=PLCvN_Pl1Blxh2ejJCGI4T-xzL3VrYtsKS
👉 Step by Step Configure Mikrotik Router as a switch -> GOOGLE DRIVE
👉 CYBERSECURITY GUIDE => OWASP CHEATSHEET -> https://cheatsheetseries.owasp.org/cheatsheets/GraphQL_Cheat_Sheet.html
👉 MIT NETWORK SECURITY -> https://www.youtube.com/watch?v=QOtA76ga_fY&list=PLBMO3ORTdwldCqlHVcbJSOa3GxLS6bpeE
👉 MIT DYNAMIC PROGRAMMING -> https://www.youtube.com/watch?v=OQ5jsbhAv_M&list=PLZES21J5RvsHOeSW9Vrvo0EEc2juNe3tX
👉 Algorithms & Data Structures -> https://www.youtube.com/watch?v=XGClOyyliPs&list=PL7yh-TELLS1HgoWUfxGzoaa7PEJ2Q-RfC
👉 Python Machine Learning Tutorial -> https://www.youtube.com/watch?v=jg5paDArl3E&list=PL7yh-TELLS1EZGz1-VDltwdwZvPV-jliQ
👉 DATA STRUCTURE AND ALGORITHM GUIDE => https://leetcode.com/discuss/general-discussion/494279/comprehensive-data-structure-and-algorithm-study-guide
👉 LEETCODE SOLUTION ALGORITHMS => Leetcode Solution Algorithms.pdf at [email protected]
Algorithms & Data Structures -> https://www.youtube.com/watch?v=XGClOyyliPs&list=PL7yh-TELLS1HgoWUfxGzoaa7PEJ2Q-RfC
👉 DESIGN AND ANALYSIS OF ALGORITHM GUIDE
BIG O Notation -> https://www.youtube.com/watch?v=XMUe3zFhM5c
Big O Part 1 – Linear Complexity -> https://www.youtube.com/watch?v=OMInLBKYAWg&list=PLTd6ceoshprfdzLWovxULl8Lt7RAFKTe5
Big O Part 2 – Constant Complexity -> https://www.youtube.com/watch?v=1Md4Uo8LtHI&list=PLTd6ceoshprfdzLWovxULl8Lt7RAFKTe5&index=2
Big O Part 3 – Quadratic Complexity -> https://www.youtube.com/watch?v=mIjuDg8ky4U&list=PLTd6ceoshprfdzLWovxULl8Lt7RAFKTe5&index=3
Big O Part 4 – Logarithmic Complexity -> https://www.youtube.com/watch?v=Hatl0qrT0bI&t=26s
Big O Part 5 – Linearithmic Complexity -> https://www.youtube.com/watch?v=i7CmolBf3HM&t=19s
Big O Part 6 – Summary of Time Complexities -> https://www.youtube.com/watch?v=XiGedDZGOM8
Big O Part 7 – Space Complexity versus Time Complexity -> https://www.youtube.com/watch?v=bNjMFtHLioA
🧨🧨🧨 ADVANCE -> https://www.youtube.com/watch?v=0IAPZzGSbME&list=PLDN4rrl48XKpZkf03iYFl-O29szjTrs_O
👉 DISTRIBUTED SYSTEM GUIDE
Distributed Systems(Overview) -> https://www.youtube.com/watch?v=cQP8WApzIQQ&list=PLrw6a1wE39_tb2fErI4-WkMbsvGQk9_UB
Distributed Systems(Advance) -> https://youtube.com/channel/UCnhpOONF1c1FtipDF8LPdqQ
👉 LINUX GUIDE => Linux Bible
Linux -> https://www.youtube.com/watch?v=ghWECXWi9kU&list=PL7yh-TELLS1FRoV4TCJjyEwh_Ye1o9sik
👉 SQL GUIDE => https://sqlbolt.com/
👉 SQLSTYLE GUIDE -> https://github.com/DancunManyinsa/sqlstyle.guide
1. DATABASE Normalization(1-4) & SQL Tutorial -> https://www.youtube.com/watch?v=y03oYWDLu0Q&list=PLTd6ceoshprdZIu24-_u00-m2Vf9o5aVC
2. SQL Tutorial(Overview) -> https://www.youtube.com/watch?v=kXQccIONcc4&list=PL7yh-TELLS1FVLG6AFoT-3lVcbJiPTWze
4. SQL JOINS -> https://www.youtube.com/watch?v=fx-_7g-Ssgw&list=PLcnJIHtHiTA3eqEjdvS_JBvtcRstjWufE
👉 VIM GUIDE => Install Neovim
Vim Tutorial -> https://www.youtube.com/watch?v=jXud3JybsG4&list=PL7yh-TELLS1Eevqs4-XmlSfrFOKb5C97K
👉 PYTHON GUIDE => Python Official Documentation
👉 PYTHON CODING STYLE (PEP8) => PEP8 Official Documentation
1. Fundamental Concepts of Object Oriented Programming -> https://www.youtube.com/watch?v=m_MQYyJpIjg
Python OOP Fundamentals -> https://www.youtube.com/watch?v=Ej_02ICOIgs
Python OOP Edureka-> https://www.youtube.com/watch?v=SRu1GAfr3LA
PEP8 Coding Guidelines -> https://www.youtube.com/watch?v=U2vDWc5sce0
2. Importing your python files(Overview) -> https://www.youtube.com/watch?v=GxCXiSkm6no
Importing your python files(Advance)-> https://www.youtube.com/watch?v=e9yMYdnSlUA
3. Tips to organize Python code -> https://www.youtube.com/watch?v=e9yMYdnSlUA
4. Packaging Your Python Code With pyproject.toml -> https://www.youtube.com/watch?v=v6tALyc4C10&list=TLPQMDcxMDIwMjNyZziH_2-ouA&index=4
5. Make Your Python Code More Professional -> https://www.youtube.com/watch?v=PDqy1HoA3QM&list=TLPQMDcxMDIwMjNyZziH_2-ouA&index=5
6. Makefiles in Python For Professional Automation -> https://www.youtube.com/watch?v=Yt-UF7fNLJE&list=TLPQMDcxMDIwMjNyZziH_2-ouA&index=3
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
7. Python Beginner Tutorial -> https://www.youtube.com/watch?v=-eaFKumWT1k&list=PL7yh-TELLS1E6dNCzfQl-NG-KJP3C-4mc
8. Python Intermediate Tutorial -> https://www.youtube.com/watch?v=2S7Xxz9PhaU&list=PL7yh-TELLS1F3KytMVZRFO-xIo_S2_Jg1
9. Python Advanced Tutorials -> https://www.youtube.com/watch?v=KSiRzuSx120&list=PL7yh-TELLS1FuqLSjl5bgiQIEH25VEmIc
10. Python Tips and Tricks -> https://www.youtube.com/watch?v=Wz5oLT55_jY&list=PL7yh-TELLS1GNyuvPsFEqb7JVMEUTtuau
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
11. Algorithms & Data Structures -> https://www.youtube.com/watch?v=XGClOyyliPs&list=PL7yh-TELLS1HgoWUfxGzoaa7PEJ2Q-RfC
12. Python Data Science Tutorial -> https://www.youtube.com/watch?v=hVcEv7rEN24&list=PL7yh-TELLS1FfO5Q8KHK31VgsrcnWcTAk
13. Python Machine Learning Tutorial -> https://www.youtube.com/watch?v=jg5paDArl3E&list=PL7yh-TELLS1EZGz1-VDltwdwZvPV-jliQ
👉 RUST GUIDE => Rust Official Documentation
=> 🧨 Rust IMAGES
=> 🧨 RUST CRASH COURSE https://www.youtube.com/watch?v=lzKeecy4OmQ
1. Rust Bites -> https://www.youtube.com/watch?v=gvHPRaZHgD4&list=PLai5B987bZ9C0xK1tZ2wsudOo2zjrD9xX
2. The Rustlang Book -> https://www.youtube.com/watch?v=OX9HJsJUDxA&list=PLai5B987bZ9CoVR-QEIN9foz4QCJ0H2Y8
3. Rust Explained -> https://www.youtube.com/watch?v=784JWR4oxOI
1) Zero cost abstraction
2) Rust Ownership based on Resource Acquisition Is Initialization (RAII) Design Pattern
- It states that resources such as allocated memory, file handles and database connections must be tied to object lifetime.
3) Algebraic Data Types(ADT) or Variant types. Example: Enum and Struct
4) Polymorphism using Traits and Generics
- Traits are interfaces that define a set of functions and methods that types can implement. -> Focuses on functions.
- Generics enable you to write code that is abstracted over types, which leads to reusable and efficient code. -> Focuses on arguments.
Advantages of Rust Polymorphism
- Flexibility and composition.
- Non-invasive and extensible.
- No fragile base class problem.
- Static dispatch supported by default, making code performant.
5) Async/Await
6) Macros -> Metaprogramming
- Enable a program to manipulate or generate code at compile-time.
-> Macros are a language feature that enables the programmer to define custom syntax and perform code generation/code transformation.
-> Provide a way to write code that modifies or generates other code at compile-time.
NOTE: Macros are hygienic by default, so variables and identifiers do not accidentally clash with variables from the surrounding code.
7) Cargo
Advantages of cargo
- Dependency management.
- Project building.
- Testing. -> Use Clippy for extra lints and clearer warning messages.
- Package distribution.
4. Memory Management -> https://www.youtube.com/watch?v=NnLdGKoz1ls&list=PLai5B987bZ9BnS7dBlTF9HRop55vYUL7r
5. Idiomatic Rust -> https://www.youtube.com/watch?v=6b-8gpLCrrg&list=PLai5B987bZ9A5MO1oY8uihDWFC5DsdJZq
6. Rayon crate for Parallelism -> https://www.youtube.com/watch?v=YxG7PhZ3fb4&list=PLai5B987bZ9DIGuIrI2-KHqe5ijBTVpk2
👉 C GUIDE => C Official Documentation
=> C KNOWLEDGE FOR COMPUTER PROGRAMMING FUNDAMENTALS
=> 🧨 C IMAGES
Pointers -> https://www.youtube.com/watch?v=Hb-bmAmbwvo&list=PL7yh-TELLS1GafZOVHhCPaCQUt8HrfCZE&index=9
C to LLVM IR Assembly -> https://www.youtube.com/watch?v=wt7a5BOztuM
👉 LINEAR ALGEBRA GUIDE -> https://www.youtube.com/watch?v=6MD_CTQGDyo&list=PLTd6ceoshprfZs1VIzGHDt-MYgVewC5tc
👉 CRYPTOGRAPHY GUIDE
Cryptography (Overview) -> https://www.youtube.com/watch?v=v8nlTZleKFw&list=PLTd6ceoshprcUyoyOQ_2dCvr5GPi5w_T7
Cryptography (Detailed) -> https://www.youtube.com/watch?v=1bSjcU2GeG0&list=PL58C6Q25sEEHXvACYxiav_lC2DqSlC7Og
👉 DATA REPRESENTATION GUIDE-> https://www.youtube.com/watch?v=cJNm938Xwao&list=PLTd6ceoshprfijQztP-IKey4OV7nkr_va
👉 COMPILATION GUIDE -> https://www.youtube.com/watch?v=cxNlb2GTKIc&list=PLTd6ceoshpreZuklA7RBMubSmhE0OHWh_
👉 OPERATING SYSTEM GUIDE
OS (Overview) -> https://www.youtube.com/watch?v=p9yZNLeOj4s&list=PLTd6ceoshprdCI72yuKI_VexnJygy-34e
OS (Detailed) -> https://www.youtube.com/watch?v=QTQ8zym8Au0&list=PLWPirh4EWFpGkHH9JTKH9KsnfAA471Fhy
👉 COMPUTER ARCHITECTURE GUIDE -> https://www.youtube.com/watch?v=_I8CLQazom0&list=PLTd6ceoshprfg23JMtwGysCm4tlc0I1ou
👉 DIGITAL ELECTRONICS GUIDE
Bitwise Operator -> https://www.youtube.com/watch?v=bizj3dle8Qc&list=PLTd6ceoshprdJUxsjWLGidOToR1_9uIq_
Logic Gates -> https://www.youtube.com/watch?v=fw-N9P38mi4&list=PLTd6ceoshprfc_VVJYunO1BN9peCTMQgr
Karnaugh Maps -> https://www.youtube.com/watch?v=3vkMgTmieZI&list=PLTd6ceoshpre9qoX0l_uyNnL_4W0Yl-FT
Latches and FlipFlops -> https://www.youtube.com/watch?v=-aQH0ybMd3U&list=PLTd6ceoshpreKyY55hA4vpzAUv9hSut1H
Boolean Algebra -> https://www.youtube.com/watch?v=EPJf4owqwdA&list=PLTd6ceoshprcTJdg5AI6i2D2gZR5r8_Aw
👉 ANALOGUE ELECTRONICS GUIDE => Electronics Images
MIT CLASSICAL MECHANICS/PHYSICS -> https://www.youtube.com/watch?v=nvjwr9PTBP8&list=PLaLOVNqqD-2HUv0qoYzSj0SBOIO_pLtW4
MIT STATISTICAL MECHANICS -> https://www.youtube.com/watch?v=4RX_lpoGRBg&list=PLUl4u3cNGP60gl3fdUTKRrt5t_GPx2sRg
MIT ENGINEERING DYNAMICS -> https://www.youtube.com/playlist?list=PLUl4u3cNGP62esZEwffjMAsEMW_YArxYC
👉 COMPUTER MOTHERBOARD ELECTRONICS GUIDE -> https://m.youtube.com/@electronicsrepairbasics_erb
===============================================================================================================================================
LOCAL IP ADDRESS -> For connecting to a local server
#check ip address
Windows: use systeminfo (or ipconfig).
Linux: use ifconfig. The default local IP address is 127.0.0.1, OR check the wlp2s0 option.
PUBLIC IP ADDRESS -> For connecting to a public server
Windows GUI: check the IP address of the network connected to your device, i.e. Ethernet or Wi-Fi.
Linux GUI: check the IP address of the network connected to your device, i.e. Ethernet or Wi-Fi (see the sketch below).
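A minimal sketch, assuming internet access and the third-party requests package, for checking both the local and the public address from Python; the 8.8.8.8 target and the ipinfo.io endpoint are illustrative choices, not values from these notes.
# Hedged sketch: discover the local and public IP address from Python.
import socket
import requests  # third-party: pip install requests

def local_ip():
    # Connecting a UDP socket towards a public address sends no packets,
    # but lets the OS pick the outgoing interface whose address we then read.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]

def public_ip():
    # ipinfo.io echoes back the address the request arrived from.
    return requests.get("https://ipinfo.io/json", timeout=5).json()["ip"]

print("Local IP:", local_ip())
print("Public IP:", public_ip())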
SOCKET PROGRAMMING -> End-point communication i.e kernel, internet, bluetooth
Example:
AF_INET -> IPV4
AF_INET6 -> IPV6
TCP -> (Transmission Control Protocol) i.e. SOCK_STREAM -> USE_CASE - Detecting and maintaining connection requests, sending messages/files (see the socket sketch after these lists)
Reliable - Detects packet loss
Connection-based
Sequential
Byte-stream
Keeps up connection
UDP -> (User Datagram Protocol) i.e. SOCK_DGRAM -> USE_CASE - Real-time data transmission like video/audio, online gaming
Sends one datagram
No order
No guarantee of receiving packets
Real-time
Faster
Less network and PC stress
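As a hedged illustration of the two socket types above, here is a minimal self-contained Python sketch; the loopback address and port 50007 are arbitrary choices, not values from these notes.
# Minimal TCP (SOCK_STREAM) and UDP (SOCK_DGRAM) sketch on the loopback interface.
import socket

HOST, PORT = "127.0.0.1", 50007  # illustrative address and port

def tcp_echo_once():
    # TCP: connection-based, reliable byte stream.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect((HOST, PORT))  # handshake completes against the listen backlog
            conn, _ = srv.accept()
            cli.sendall(b"hello over tcp")
            print("TCP received:", conn.recv(1024))
            conn.close()

def udp_send_once():
    # UDP: connectionless datagrams, no delivery guarantee, no ordering.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as srv:
        srv.bind((HOST, PORT))
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as cli:
            cli.sendto(b"hello over udp", (HOST, PORT))
            data, addr = srv.recvfrom(1024)
            print("UDP received:", data, "from", addr)

if __name__ == "__main__":
    tcp_echo_once()
    udp_send_once()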
Asymmetric RSA Private and Public Keys (see the sketch below)
Personal - Private Key (secret key) -> 4. Decrypts the message (received) -> 2. Signs the message (sent)
Accessible to the public - Public Key -> 1. Encrypts the message (sent) -> 3. Verifies the signature and message (received)
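A minimal sketch of the key roles listed above, assuming the third-party cryptography package (pip install cryptography); the message text and key size are illustrative.
# Hedged sketch: RSA encrypt/decrypt and sign/verify with the `cryptography` package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Key pair: keep the private key secret, share the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"confidential message"

# 1./4. Encrypt with the recipient's PUBLIC key, decrypt with their PRIVATE key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(message, oaep)
assert private_key.decrypt(ciphertext, oaep) == message

# 2./3. Sign with the sender's PRIVATE key, verify with their PUBLIC key.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())
public_key.verify(signature, message, pss, hashes.SHA256())  # raises InvalidSignature if tampered
print("encrypt/decrypt and sign/verify both succeeded")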
👉 Python Applications
Acknowledgement: NeuralNine
------------------------------------------------------
Run Youtube Script
RUN PYINSTALLER ...
# WINDOWS
GUI Voice recorder App -> https://www.youtube.com/watch?v=u_xNvC9PpHA
GUI AI Voice Assistant Bot App -> https://www.youtube.com/watch?v=F62wb_jfUUw
GUI Motion Detection App -> https://www.youtube.com/watch?v=QPjPyUJeYYE
GUI Two players Chess App
GUI Clock App
------------------------------------------------------
# BOTH WINDOWS AND LINUX
Bluetooth Script -> https://www.youtube.com/watch?v=8pMaR-WUc6U
RUN NGROK ...
# WINDOWS
GUI Zoom App -> https://www.youtube.com/watch?v=bJOvYgSqrOs
# BOTH WINDOWS AND LINUX
HTTP Server Script
TCP + Advanced TCP -> https://www.youtube.com/watch?v=F_JDA96AdEI
UDP -> https://www.youtube.com/watch?v=3qlhbez-RPI
Online Chess
------------------------------------------------------
✅
✅ Software Programs which can be executed.
# BOTH WINDOWS AND LINUX
HTTP Server Script
# WINDOWS => ON DIFFERENT MACHINES
Bluetooth Script
Zoom Script
# LINUX => ON EITHER SAME MACHINE OR DIFFERENT MACHINES
TCP + Advanced TCP
UDP
---------------------------------------------------------------------------
👉 RUN PYINSTALLER ...
GUI Two players Chess App
GUI Clock App
---------------------------------------------------------------------------
---------------------------------------------------------------------------
👉 RUN NGROK ... => https://www.youtube.com/watch?v=hs7GLsHQCPQ
HTTP Server Script
---------------------------------------------------------------------------
-> ***👉 Python Script to Control my Lights using phue philips bulb -> https://www.youtube.com/watch?v=kSruoqDTYt0
👉 Automating Scripts + (Youtube,Image, PDF merger) scripts -> https://www.youtube.com/watch?v=vEQ8CXFWLZU
👉 Python built-in http server
python3 -m http.server
OR
python -m http.server
#Check for local ip address
#windows use systeminfo.
#linux use ifconfig, then check the wlp2s0 option. The default local ip address 127.0.0.1 does not work for reaching the server from another machine.
-> Convert a file or folder to a zip archive in order to download it from another machine (a sketch follows)
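A minimal sketch of that zip step using the standard-library shutil module; the folder name my_folder is only a placeholder.
# Hedged sketch: package a folder as my_folder.zip so http.server can offer it for download.
import shutil

shutil.make_archive("my_folder", "zip", root_dir="my_folder")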
---------------------------------------------------------------------------
✅ RUN NGROK ... => https://www.youtube.com/watch?v=hs7GLsHQCPQ
HTTP Server Script
python3 -m http.server 9999   # or any other port number above 1023
Then open a second terminal and run
ngrok config add-authtoken 2V4ZbMGhudJBSB8F1Z303Qf0yFc_2i9fWkV8ECNg6bL8zfABS
ngrok http [port-above]
Then go to a different machine and type the [link displayed] [e.g. https://97f0-102-219-208-122.ngrok-free.app] in the browser
---------------------------------------------------------------------------
👉 Hide computer IP address for watching movies
import requests
proxies = {
    'https': '96.80.235.1:8080'  # Proxy server of your preferred country
}
response = requests.get("https://ipinfo.io/json", proxies=proxies)
print(response.text)
# Once it returns JSON text showing the proxy IP address of your preferred country, it was successful. Use it on movie platforms, but note that some platforms have advanced software and techniques to identify your real IP address.
👉 Clock script -> https://www.youtube.com/watch?v=l7IMBy4_nhA -> Download and Install Ds-Digital from official website
from tkinter import *
from tkinter.ttk import *
from tkinter import Label, Tk
from time import strftime

# Configure window
window = Tk()
window.title("Digital Clock")
# window.geometry("1000x800")
window.configure(bg="white smoke")
window.resizable(False, False)

def time():
    """
    day_string = strftime("%A")
    day_label.config(text=day_string)
    date_string = strftime("%B %d, %Y")
    date_label.config(text=date_string)
    """
    time_string = strftime("%a %d-%m-%Y %I:%M:%S %p")
    label.config(text=time_string)
    label.after(1000, time)

"""
day_label = Label(window, font=("ds-digital", 50), background="white smoke", foreground="black")
day_label.pack()
date_label = Label(window, font=("ds-digital", 50), background="white smoke", foreground="black")
date_label.pack()
"""
label = Label(window, font=("ds-digital", 30), background="black", foreground="cyan")
label.pack(anchor="center")
time()
window.mainloop()
👉 Photo Compressor
This automation script compresses a photo to a much smaller file size while keeping acceptable quality.
import PIL
from tkinter.filedialog import *
from PIL import Image

file_loc = askopenfilenames()
img = Image.open(file_loc[0])
img.save("Compressed.jpg", "JPEG", optimize=True, quality=10)
print("Image is Compressed")
👉Image Watermarker
You have probably used photo-editing software to watermark your photos. This simple Python script using the PIL module will watermark any image. You can set the text, location, and even the font.
from PIL import Image
from PIL import ImageFont
from PIL import ImageDraw

def watermark_Image(img_path, output_path, text, pos):
    img = Image.open(img_path)
    drawing = ImageDraw.Draw(img)
    black = (10, 5, 12)
    drawing.text(pos, text, fill=black)
    img.show()
    img.save(output_path)

img = '1.jpg'
watermark_Image(img, 'watermarked.jpg', 'Python', pos=(0, 0))
👉 InstaDpViewer
This Python automation script will help you view the profile picture (DP) of any Instagram user. The script uses the instaloader module, which takes a username as input and downloads the DP as output.
import instaloader
ig = instaloader.Instaloader()
dp = input("Enter Insta username : ")
ig.download_profile(dp , profile_pic_only=True)
print("Your Image is Downloaded")
👉 Plagiarism Checker
This is an awesome script that will help you check the plagiarism between two files. You no longer need any software or a web app for plagiarism checking; this will do the work in a second.
from difflib import SequenceMatcher

def plagiarism_checker(f1, f2):
    with open(f1, errors="ignore") as file1, open(f2, errors="ignore") as file2:
        f1_data = file1.read()
        f2_data = file2.read()
        checking = SequenceMatcher(None, f1_data, f2_data).ratio()
        print(f"These files are {checking*100} % similar")

file_1 = input("Enter file 1 path: ")
file_2 = input("Enter file 2 path: ")
plagiarism_checker(file_1, file_2)
👉 YT Video Downloader
This is another simple automation script to download Youtube videos. Now you don’t need any web app or software, just use the below code to download any video.
🌀 Best way
from pytube import YouTube
from sys import argv
link = argv[1]
yt = YouTube(link)
print("Title: ", yt.title)
print("Views: ", yt.views)
yd = yt.streams.get_highest_resolution()
# Add a folder here in both linux and windows
yd.download('s/YouTubeDownloadFolder')
# Windows location to retrieve file or to store downloaded file
r"C:\Users\User\Desktop"
# Executing python script
python3 youtube_downloader.py "https://www.youtube.com/watch?v=--BSK3cxh3U"
🌀 Using an automated script
1 Ensure the shell script is in the same directory as the Python script.
2 If you used external dependencies, ensure the Python pip environment is activated so the script can access them.
3 nano youtube_downloader.sh
4 # Write the following:
#!/bin/bash
cd "/home/dan/Desktop/Projects/Python Scripts/youtube-downloader"
python youtube_downloader.py "$1"
5 ./youtube_downloader.sh "https://www.youtube.com/watch?v=5t8BStaWRgM"
1st way
from pytube import YouTube

# Insert video URL
yt_video = YouTube("<video link>")
# File extension and resolution
v_file = yt_video.streams.filter(file_extension="mp4").get_by_resolution("1080p")
# Download and save video
v_file.download("save_path i.e C:\\Users\\Dan\\Desktop")
print(">>>Download Complete")
2nd way
import pytube

link = input('Enter Youtube Video URL')
yt = pytube.YouTube(link)
yt.streams.first().download()
print('downloaded', link)
👉 Convert PDF to CSV
Sometimes we need to convert our PDF data into CSV format, so this Python script will be handy for that kind of work. Two methods are shown below.
import tabula
import camelot

# Method 1
filename = input("Enter File Path: ")
df = tabula.read_pdf(filename, encoding='utf-8', spreadsheet=True, pages='1')
df.to_csv('output.csv')

# Method 2
tables = camelot.read_pdf('file.pdf')
tables.export('file.csv', f='csv', compress=True)
👉 File Encrypt and Decrypt
Want to lock your file? Then this script will be handy for you. Below are encryption and decryption functions that work on any file.
from cryptography.fernet import Fernet

def encrypt(filename, key):
    fernet = Fernet(key)
    with open(filename, 'rb') as file:
        original = file.read()
    encrypted = fernet.encrypt(original)
    with open(filename, 'wb') as enc_file:
        enc_file.write(encrypted)

def decrypt(filename, key):
    fernet = Fernet(key)
    with open(filename, 'rb') as enc_file:
        encrypted = enc_file.read()
    decrypted = fernet.decrypt(encrypted)
    with open(filename, 'wb') as dec_file:
        dec_file.write(decrypted)

# Generate Key
key = Fernet.generate_key()
filename = input("Enter Your filename: ")
encrypt(filename, key)
decrypt(filename, key)
👉 Battery Notification
You have seen a battery notification on your mobile phone, so what if your laptop also notified you about its battery status? This Python script does exactly that using a few modules. Check out the code below.
import time
import psutil
import pyttsx3
import threading
from win10toast import ToastNotifier

bot = pyttsx3.init()
bot.setProperty('rate', 110)
bot.setProperty('volume', 3)
toaster = ToastNotifier()

def display_notification(text):
    toaster.show_toast(text, duration=8)
    while toaster.notification_active():
        time.sleep(0.003)

def Battery_Notification():
    while True:
        time.sleep(2)
        battery = psutil.sensors_battery()
        plugged = battery.power_plugged
        percent = int(battery.percent)
        if percent < 15:
            if plugged == False:
                processThread = threading.Thread(target=display_notification, args=("Your Battery at " + str(percent) + "% Please Plug the charger",))
                processThread.start()
                bot.say("Your battery is getting low so plug your charger")
                bot.runAndWait()
        elif percent >= 99:
            if plugged == True:
                processThread = threading.Thread(target=display_notification, args=("Charging is getting complete",))
                processThread.start()
                bot.say("Charging is Completed")
                bot.runAndWait()

Battery_Notification()
👉 Convert Images to PDF
If you have a lot of images and want to convert them into a single PDF, this automation script will be handy for you.
import os
import img2pdf

# Method 1
with open("Output.pdf", "wb") as file:
    file.write(img2pdf.convert([i for i in os.listdir('Path of image_Directory') if i.endswith(".jpg")]))

# Method 2
from fpdf import FPDF

Pdf = FPDF()
list_of_images = ["1.jpg", "2.jpg"]
for i in list_of_images:  # list of images with filename
    Pdf.add_page()
    Pdf.image(i, x=10, y=10, w=190)  # x, y, w are placeholder position/size values
Pdf.output("yourfile.pdf", "F")
👉 Text to Speech AI Bot
If you know about the Jarvis AI, this script works similarly. It uses the Google Text-to-Speech API to convert your written text into an AI bot voice. Check out the code below.
from pygame import mixer
from gtts import gTTS

def main():
    tts = gTTS('Learn Python from Medium')
    tts.save('python.mp3')
    mixer.init()
    mixer.music.load('python.mp3')
    mixer.music.play()

if __name__ == "__main__":
    main()
👉 Battery Notification (Working Script)
#Best Practice
from psutil import sensors_battery

battery = sensors_battery()

def main():
    print(f"Percentage: {battery.percent}")
    print(f"Power Plugged: {battery.power_plugged}")
    print(f"Seconds Left: {battery.secsleft}")

if __name__ == "__main__":
    main()
#Novice approach
from psutil import sensors_battery

battery = sensors_battery()

if __name__ == "__main__":
    print(f"Percentage: {battery.percent}")
    print(f"Power Plugged: {battery.power_plugged}")
    print(f"Seconds Left: {battery.secsleft}")
👉 Battery Notification (Working Script)
import psutil
battery_status = psutil.sensors_battery()
print(battery_status)
print("Percentage of battery: %s percentage" % (battery_status.percent,))
print("Approximate time remaining: %s seconds" % (battery_status.secsleft,))
print("Is power cable connected: %s" % (battery_status.power_plugged,))
👉 Battery Notification
import psutil
from plyer import notification

# From psutil we use the sensors_battery class, which gives us the battery percentage
threshold = int(input('Enter the threshold: '))
battery = psutil.sensors_battery()
percent = battery.percent
while True:
    battery = psutil.sensors_battery()
    cur_per = battery.percent
    change = cur_per - percent
    diff = abs(change)
    # We calculate the change in the battery
    # and show a notification if the battery level increases or decreases
    if diff >= threshold:
        notification.notify(
            title="Battery Percentage",
            message=str(cur_per) + "% Battery Remaining",
            timeout=5
        )
        percent = cur_per
        continue
👉 Battery Notification
import psutil
import time
import pyttsx3
from win10toast import ToastNotifier  # also need to install win32api
import threading

toaster = ToastNotifier()
x = pyttsx3.init()
x.setProperty('rate', 130)
x.setProperty('volume', 8)
count = 0

def show_notification(show_text):
    toaster.show_toast(show_text,
                       icon_path='battery_indicator.ico',
                       duration=10)
    # loop the toaster over some period of time
    while toaster.notification_active():
        time.sleep(0.1)

def monitor():
    global count  # needed because count is reassigned inside this function
    while True:
        time.sleep(10)
        battery = psutil.sensors_battery()
        plugged = battery.power_plugged
        percent = int(battery.percent)
        if percent < 40:
            if plugged == False:
                processThread = threading.Thread(target=show_notification, args=("Your Battery at " + str(percent) + "% Please plug the cable",))  # <- note extra ','
                processThread.start()
                x.say("Your battery is getting low so charge it right now")
                x.runAndWait()
                count = 0
        elif percent == 100:
            if plugged == True:
                processThread = threading.Thread(target=show_notification, args=("Charging is getting complete",))  # <- note extra ','
                processThread.start()
                x.say("Charging is getting complete")
                x.runAndWait()
        elif percent == 90:
            if plugged == True:
                if count == 0:
                    processThread = threading.Thread(target=show_notification, args=("Your Battery at 90% Please plug out the cable",))  # <- note extra ','
                    processThread.start()
                    x.say("Your battery at 90% ")
                    x.runAndWait()
                    count = count + 1

if __name__ == "__main__":
    monitor()
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% USEFUL CONTACTS & LINKS HASS Consult 0709479479 020 3258000
0745238973 EABL Rose Kibiru 0752417150 Driver 0769863503 Kiarie 0720298571 Mbugua [email protected] @James System 0721385078 @Sam-comrade ->0713835028
[email protected] Telephone
0733571368 Telephone
0722488400
https://foht.co.ke/ Software Engineer https://thintech.co.ke/ Consultancy
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
SAF 07->0,1,2,4,5,6,9
AIRTEL 07->3,8
TELCOM 07->7
send money SAF -> TELCOM->263263 SAF->AIRTEL-> 220220
ADOBE PHOTOSHOP 1045-1656-1984-6490-9561-5688
http://127.0.0.1:8000/register?next=/
Cybersecurity firms
onsite cybersecurity firm contact Sam for location
Safaricom
Serianu
Sentinel
👉 Important links
Law link
https://www.qub.ac.uk/courses/undergraduate/2021/law-senior-status-llb-m101/
Datascience Concepts
https://t.co/zfWgEfyA8d?amp=1
https://t.co/SQamq5CSRQ?amp=1
https://github.com/jakevdp/PythonDataScienceHandbook/tree/master/notebooks
Network and Software Engineering https://mega.nz/#F!rqpiBQIa!aqY8L9NmaifoXTGVQogiWg
Cybersecurity and AI Engineering https://mega.nz/#F!2fRgmYSa!dvEPOubQEBUiRaXu7SktBg!jXw3ySIS
Cybersecurity
[7:01 PM, 3/28/2019] Anthony - CEH: https://katcr.co/torrent/364470/udemy-practice-your-first-penetration-test-kali-amp-metasploit-lab.html
[7:02 PM, 3/28/2019] Anthony - CEH: https://monova.to/167CEE868BB34A178B72B530BE27CD98EFBE0688
[7:03 PM, 3/28/2019] Anthony - CEH: https://katcr.co/torrent/307838/udemy-ethical-hacking-with-metasploit-the-penetration-testing-tool.html
Data Engineer: https://leetcode.com/discuss/general-discussion/494279/comprehensive-data-structure-and-algorithm-study-guide
Data Structures: https://youtu.be/B31LgI4Y4DQ
Applying for Attachment:
Networks
Unitel Kenya Contacts:
0722774801
0722782035
[email protected]
Vision Plaza, Mombasa RD.
Data Science https://okhi.recruiterbox.com/jobs/fk0j7cr/
Cybersecurity https://www.un-yk.com/
Freelance Upwork
Fitness -> Chris Heria https://www.youtube.com/watch?v=3p8EBPVZ2Iw
<a href="mailto:[email protected]" target="_blank">
Email: [email protected]
</a>
web: gunicorn waweru.wsgi --log-file -
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% PHONE BACKUP
https://findmymobile.samsung.com
CHECK IMEI CODE ORIGINAL ANDROID PHONE ===> *#06# CHECK ORIGINAL SAMSUNG PHONE ===> #0#
SAMSUNG S8 Model number: SM-G950FD Serial number: RF8K21X3L2K IMEI (slot 1): 355258091801717 IMEI (slot 2): 355259091801715
REDMI Note 11 Model number: 2201117TG Serial number: 5e8e89e1 IMEI (slot 1): 867654067277828 IMEI (slot 2): 867654067277836
LINUX CHECK SERIAL NUMBER -> sudo dmidecode -s system-serial-number X1 Carbon ThinkPad Serial Number: PF15VBBQ Machine Type Model: 20KGS6J300 BIOS Supervisor: Right index/ devopsengineer
.................................................
=============================================================================================================================================================================
KRA ---> [email protected] ---> 0711444518 A012540149K engineerDaniel@1234
https://adili.eacc.go.ke/ LOGIN -> [email protected] -> CONARDMOMANYI@1daniel
Microsoft Teams: conardmomanyi1234 Microsoft acc: [email protected] => Passwd: Conardmomanyi123 => Security code for 0711444518 => Microsoft Teams, Skype, Microsoft OneDrive, Microsoft Office 365 Microsoft acc: [email protected] => Security code for [email protected] => Microsoft Teams, Skype, Microsoft OneDrive, Microsoft Office 365
SAF PORTAL: @@[email protected]@
SAF To easily reset your M-PESA PIN in future, set your M-PESA security questions. Dial *334#, select My account>M-PESA PIN Manager >Set M-PESA security questions,
456 Fuliza
fam ID: 35687960 #equity: 0764465804 CHEDA:1487 #jkuat co-op acc: 01129098952900 #equity acc: 1520179881302 #equity business paybill: 247247 National saf: 0711444518 PIN1:1274 PIN2:8411 PUK1:98828111 PUK2:15573510 CHEDA:1496 MOM SAF NUMBER: 0721160802 PIN: 9970 saf buy airtime to telcom baybill 215215 acc telcom number
Work saf: 0706738969 PIN1:8218 PIN2:5372 PUK1:42844297 PUK2:28282050 CHEDA:1493
Fam
WORK telcom: 0771587812 AIRTEL 0787062079 PIN => 0000 PUK => 38908427
phone: 1493 whats-app:147369
[email protected] conardmomanyimoruri [email protected] CONARDMOMANYI@123456 [email protected] CONARDMOMANYI@654321 BUSINESS ACCOUNT[Cybersecurity and AI ][email protected] SUNANDPLANETS => [email protected] [email protected] SUNANDPLANETS => [email protected]
github: [email protected]: conardmomanyimoruri [email protected]: conardmomanyimoruri
discord: softwareengineer [email protected]: QWERTY@987
slack : Sign in with [email protected]
telegram: Daniel @daniel_softwareengineer +254711444518 David @david_softwareengineer +254706738969
blueskyapp: https://bsky.app/ @engineerdancun ALPHA@987
Datalore -> [email protected] Heroku -> GITHUB Ngrok -> GITHUB LocalXpose -> LocalXposevPYQ3NXx9T!deGY
linode -> GITHUB SimpleAnalytics -> [email protected] Cloudflare -> [email protected] : 1Paste, drop, or click to add files
Google AdSense -> [email protected]
=============================================================================================================================================================================
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ PEPPERSTONE ---> [email protected] Wallet#1 -> Johnson1*@#!33A Professional Trader -> Royal^&*99GRAnd @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ EXNESS ---> [email protected] WEB -> ALPHA1one@987# PN ---> 979891
Pro Trader ACC -> Quant*@987# Pro DEMO -> Pro#1Trader
Standard Trader ACC -> Python*@#97 Standard DEMO -> Standard#1 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ FUNDEDNEXT ---> [email protected] -> CHRISTINE@9# @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ Fundingpips ---> [email protected] -> Alpha1@987# @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ INGOT ---> [email protected] WEB -> DAVIDQwerty@987#
You can login to your demo account in MetaTrader using the following details:
Investor (Read-Only Password): !NgNxH13
Account Type: PRO-USD
Platform: MT5
Server: IngotAfrica-Demo2 Login: 999971273 Master: Daniel@3#
For security reasons, we recommend that you change your MetaTrader password after your first login. @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ HFM ---> [email protected] WEB -> QuantTrader@97# ACC -> DAVIDTrader@97# Your email/ticket reference number is: 3918327963. Please include this ticket number in the subject line of all future correspondence regarding this issue. @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ Forex Blue quantdancun -> [email protected] -> DancunIQ@543&& @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
=============================================================================================================================================================================
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ BINANCE ---> [email protected] Entity ==> LIGHT TECHNOLOGIES ==> Quantitative99%@EASING#33 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ SKRILL ---> [email protected] ---> 0706738969 ==> MasterClass@ALPHA1&!BLISS33 PN ---> 339791 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
PAYPAL ---> [email protected] - --> 0711444518 ==> &Newton99%#SCIENTIST <---OR---> GOOGLEPAY ---> [email protected] ---> 0706738969
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
=============================================================================================================================================================================
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ SAVINGS => DTB ---> [email protected] ---> 0711444518 PN ---> %%%-ATM DETAILS-%%% ACC 382 ===> DRIVE [email protected] *382# Paybill -> 516600 5951948001 -> 963385 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ SAVINGS => DUKASCOPY_Swiss_Banking_Group ---> [email protected] ---> 0706738969 PN ---> %%%-ATM DETAILS-%%% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ SAVINGS => SBM ---> [email protected] ---> 0706738969 PN ---> %%%-ATM DETAILS-%%% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
SAVINGS => ABSA ---> [email protected] ---> HOME FAM NUMBER PN ---> %%%-ATM DETAILS-%%%
BUSINESS => ABSA ---> [email protected] ---> HOME FAM NUMBER PN ---> %%%-ATM DETAILS-%%%
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
=============================================================================================================================================================================
OS:
Windows Admin devopsengineer/d@97 Windows Vbox Admin debian devopsYTRewq@987 Windows Vbox Root debian devopsYTREWQ@987
Ubuntu Admin devopsYtrewq@987 Ubuntu Root devopsYTREWQ@987 Ubuntu Vbox OSs && openSUSE Tumbleweed Hgfdsa Root HGFDSA Ubuntu Guest Magazine@987 Ubuntu 20.04 Kingdom@#987
Fedora Admin devopsYtrewq@987 Fedora Root devopsYTREWQ@987
STEAM DETAILS: DanAlphaandOmega Standarduser1
http://portal.jkuat.ac.ke/Home/Index SCT221-0781/2016 ID number %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
===============================================================================================================================================
----------------------------------
TRADES => Futures, Commodities, Stocks, Cryptocurrency, Forex
===> BEST PRACTICE
🧨Banks -> *LOANS(BIG MONEY, Small Money-> School Fees), *ATM
🧨Investor -> *DON'T HAVE PRIOR COMMUNICATION WITH INVESTOR* Cryptocurrency, Gift, =>[LAST RESOLUTION] Given property on Lease
🧨NO LIFE INSURANCE WHATSOEVER
🧨 WEALTH NEEDS TO [LAYERING & OWNERSHIP LAWS], [ADD UP/MAKE SENSE] and COMPLY WITH TAX REGULATIONS
===> FLAGS
🧨===> Fraud, Market Manipulation, Bribery, Sedition, Tax Evasion, Embezzlement, Identity Theft, Money Laundering, Forgery/ Counterfeiting, Theft
🧨===> Failure to follow regulation
Failure to report suspicious transactions
Tipping off a money launderer
Assisting a money launderer
----------------------------------
🧨
RESPONSIBLE<=>LAW
LEADER
CLASS
TRIBE
----------------------------------
1ST 🧨 INSERTION 🧨 WHO ARE YOU 🧨 ORIGIN**** COMPARABLE to Income, Social Status, Type of Occupation [Wealthy Family, Foreign Businesses/Currency, Local] 🧨=> [Shell Banks/Companies and Tax Haven]
LAYERING TRANSACTIONS ****(Far From Carcass)
INTEGRATION
2ND 🧨 INSERTION
NORMAL BUSINESS TRANSACTIONS
INTEGRATION
----------------------------------
🧨 LAYERING TRANSACTIONS
1 Customer Details updated before A LARGE Transaction => An Account Takeover and Layering Activity
2 LARGE DEPOSIT which is => NOT COMPARABLE to Income, Social Status, Type of Occupation
3 Detection of Structuring => Large number of transactions below a reporting threshold in a specific period of Time
4 Unusual spending pattern => NOT COMPARABLE to Income, Social Status, Type of Occupation
5 Dormant Accounts/Businesses => which display LARGE Transactions in Bank and Business Accounting Books
6 Low Buyers Diversity => Same Buyers on a business which usually receives variety of Buyers
7 Disproportionate Flow-Through => NOT MAKING BUSINESS SENSE[Spending More and No Profit] / NO PROFIT OR LOSS MAKING
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
8 High Risk Countries => Countries known for Bank Secrecy/Financial Crime/Tax Haven
9 Immediate Withdrawal to Private Wallets => Large Amount Deposited then IMMEDIATELY Large Amount Withdrawn
10 Cash Transactions => VERY LARGE CASH Transaction
11 Frequent conversions Crypto-FIAT or FIAT-Crypto => VERY LARGE and FREQUENT Transaction
12 A NORMAL BUSINESS MUST REGISTER PROFIT AND LOSS
----------------------------------
👉 TRADING RESOURCES
=====> PSYCHOLOGIST 🌀 Dr. Steven Burger
=====> 🧨🧨🧨🧨🧨🧨 FinCrime Agent
🌀 David, BULL[When You See It Swing The Bat Hard]
🌀 Daniel
--------------------------------------------
🌀 Kevin
🌀 Photon Trading
🌀 SMB Capital
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
STOCK EXCHANGE
🌀 Howard Marks
🌀 Buffett -> Berkshire Hathaway
FOREX EXCHANGE
🌀 Jim Simons -> Medallion Fund
🌀 Alex Gerko -> XTX
🧨 Quant Researcher, Quant Engineer, Quant Trader, Quant Portfolio Manager
🧨🧨🧨 Quant Engineer - LechGrzelak -> https://lechgrzelak.com/
🌀 Quant - Korean Yuppie -> https://www.youtube.com/watch?v=kFiiMymTtO0
🌀 Quant Books - QuantPy -> https://www.youtube.com/watch?v=ZSavDAaKy50
🌀 Quant Resources - Coding Jesus -> https://www.youtube.com/watch?v=cXNL4JKBL6w
👉 BEST PRACTICES
BUY BEST PRODUCT AT LOW & FAIR PRICE *NOT* WORST AT LEAST PRICE
SELL AT HIGH PRICE(BULL) => HIGH PROFIT
HAVE INVESTORS INTO BUYERS & INCREASE CASHFLOW
OPERATE AS LOWER MIDDLE-CLASS IN DEBT*
income#1(high+diverse)-expense#3(low+last)=investment#2(medium)+savings(high)#4
INVESTMENT
invest in stock(OWN stock/asset/property)
invest in (low cost)ETFs and index funds* ->Passive Investment
SAVINGS
save for emergency(high)->investment(medium)->asset/property(low)#Gold, Land and Property
PROTECTION
Lawyers
Tax Accountants(policies+best practices)
Shell companies(ownership) to avoid damage on you
----------------------------------------------------------------------------------------------------------------------------------
👉 MANAGEMENT
🧨Managing SYSTEM[ACTIVE]
🧨Managing CASHFLOW => High Incomes
=> Low Expenses
🧨Managing PEOPLE[ACTIVE]
[ACTIVE]
🧨Managing CASHFLOW => High Incomes
=> Low Expenses
🧨PARTNERSHIP
🧨MARKETING => High Sales
🧨 BUY ASSET <=> SELL SERVICE
🧨Managing SYSTEM[ACTIVE]
----------------------------------------------------------------------------------------------------------------------------------
👉 TYPES OF INCOMES
🧨 GOVERNMENT [MILITARY/LAWS] INSURANCE
🧨 MEDIUM [BUSINESSES/TRANSACTION] INSURANCE - Black Rock
🧨 PRIVATE [PROPERTY/EQUITY] INSURANCE - Black Stone
HIGH INTEREST 🌀 BONDS INVESTMENT IN [STRONG GVT] during HIGH INTEREST RATES OF CURRENCY in the capacity of Big Bank/ Big Business ===> because currency is being diluted hence losing value and offering it as loan will return interest/profit
LOW INTEREST 🌀 HOLD Big Business/ Big Property during LOW INTEREST RATES OF CURRENCY ===> because currency is gaining value
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
🧨 Reap BIG TIME
🌀 Give Little + Benefits
BANKS (STAKE)
==> Investment(Stake Holder + Receive Dividends)
BROKER
==> Portfolio (Buy Low Sell High)
HEDGE FUND (COMMISSION)
==> Salary
===============================================================================================================================================
DATA CENTRE (TELECOMMUNICATIONS, ROUTER CONFIGURATION, COMPUTER NETWORK CONFIGURATION, CLOUD/SERVER CONFIGURATION, HARDWARE)
👉 DATA CENTRE => www.flex.com
👉 CYBERSECURITY
🧨🧨🧨 DOMAIN Security
PHYSICAL Security
NETWORK Security
PERIMETER Security
ENDPOINT Security
APPLICATION Security
DATA Security
USER Education Security
👉 DOMAIN HOSTING
👉 WEB HOSTING
👉 EMAIL HOSTING
👉 SEARCH ENGINE OPTIMIZATION
🧨🧨 CYBERSECURITY && ARTIFICIAL INTELLIGENCE
👉 CONTROLS ===> DATA PREVENTION + RETRIEVAL
===> ENCRYPTION
🧨 TELECOMMUNICATIONS => [HARDWARE + CABLE] , NETWORK-RANGE DESIGN
🧨 ROUTER CONFIGURATION => [HARDWARE + CABLE] , CONFIGURATION
🧨 ASSEMBLE DESKTOP SERVER [INSTALLATION, MAINTENANCE AND CUSTOMIZATION]
🧨 COMPUTER NETWORK CONFIGURATION 🧨 CLOUD/SERVER CONFIGURATION => [SYSTEM ADMINISTRATOR]
🌀 [DESIGN AND ANALYSIS OF NETWORK] 🌀 HARDWARE => [Large storage + Modern] RAM, SSD, GRAPHICS CARD
🌀 WIRING
🌀 WINDOWS[CONFIGURE DNS, DHCP, Active Directory] 🌀 ASSEMBLE DESKTOP SERVER [INSTALLATION, MAINTENANCE AND CUSTOMIZATION]
🌀 LINUX && WINDOWS[NETWORK ADMINISTRATOR] 🌀🌀🌀 SOLDERING OF NON-CRUCIAL SEMI-CONDUCTORS
🧨🧨 SOFTWARE ENGINEERING
🧨🧨🧨🧨🧨 PROJECTS
👉 2 ELECTRONIC PROJECTS
👉 Network + CUSTOMER SUPPORT + MANAGEMENT SYSTEM
👉 MANAGEMENT SYSTEM
👉 HR MANAGEMENT SYSTEM
-> Education => Tutor, Books, School ERP
-> Car
👉 2FA, Bulk SMS Integration, OTP Money, Fingerprint
===============================================================================================================================================
👉 ENHANCE RESOLUTION
FONT SIZE -> 1.30
CHANGE SCALING FROM 200 TO 100
CHANGE TO PORTRAIT THEN RETURN SCALING TO 200 THEN RETURN TO LANDSCAPE
---------------------------------
🌀 ENSURE YOU EXECUTE ON X11/XORG
cvt 3840 2160
xrandr --newmode "MyMode" 712.75 3840 4160 4576 5312 2160 2163 2168 2237 -hsync +vsync
xrandr --addmode eDP-1 "MyMode"
cvt 2560 1440
xrandr --newmode "MyMode" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync
xrandr --addmode eDP-1 "MyMode"
If MyMode already exists, give the new mode a different name
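After adding the mode it still has to be applied to the display; a minimal sketch, assuming the panel is eDP-1 (confirm the output name with arandr/xrandr):
xrandr --output eDP-1 --mode "MyMode"
xrandr   # the active mode is marked with an asterisk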
---------------------------------
🌀 INSTALL arandr TO CHECK DISPLAY TYPES (e.g eDP-1, Virtual-1)
sudo apt install arandr
👉 REDSHIFT
sudo apt-get install redshift redshift-gtk
redshift -O 3000K -r -P
OR
nano ~/.config/redshift.conf
[redshift]
temp-day=3600
temp-night=3300
gamma=0.8
brightness=0.8
adjustment-method=randr
location-provider=manual
[manual]
lat=31.78
lon=35.21
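To run redshift with the config above, or to undo the colour shift, these redshift flags can be used (a quick usage sketch):
redshift -c ~/.config/redshift.conf
redshift -x   # reset the screen back to normal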
---------------------- GNOME INSTALLATION ---------------------
👉 VANILLA GNOME BROKEN DEPENDENCIES
$ sudo apt install vanilla-gnome-desktop
This worked successfully and I was able to log into GNOME 40.4. But I got a random dpkg error, as follows:
Setting up vanilla-gnome-default-settings (20.04.1) ...
update-alternatives: error: no alternatives for gdm3-theme.gresource
dpkg: error processing package vanilla-gnome-default-settings (--configure):
installed vanilla-gnome-default-settings package post-installation script subprocess returned error exit status 2
dpkg: dependency problems prevent configuration of vanilla-gnome-desktop:
vanilla-gnome-desktop depends on vanilla-gnome-default-settings; however:
Package vanilla-gnome-default-settings is not configured yet.
dpkg: error processing package vanilla-gnome-desktop (--configure):
dependency problems - leaving unconfigured
No apport report written because the error message indicates its a followup error from a previous failure.
Errors were encountered while processing:
vanilla-gnome-default-settings
vanilla-gnome-desktop
E: Sub-process /usr/bin/dpkg returned an error code (1)
***SOLUTION***
sudo sed -i "s/gdm3-theme.gresource/gdm-theme.gresource/" /var/lib/dpkg/info/vanilla-gnome-default-settings.postinst && sudo apt -f install
sudo apt install vanilla-gnome-desktop
sudo apt install gnome gnome-session gnome-terminal
sudo apt update && sudo apt upgrade
sudo apt install ubuntu-gnome-desktop
sudo apt install ubuntu-desktop-minimal
sudo apt install tasksel
sudo tasksel
press SPACE, then TAB to switch to the OK/Cancel buttons, and press ENTER to OK.
***** Fix a broken gsettings setting without gnome-session ( gsettings set org.gnome.settings-daemon.plugins.xsettings overrides "[{'Gdk/WindowScalingFactor', <2>}]" ) ( gsettings set org.gnome.desktop.interface scaling-factor 0 )
****
mv .config/dconf/user{,-old}
exit
echo "user-db:user-old" >db_profile
DCONF_PROFILE=~/db_profile dconf dump / >old_settings
rm db_profile
mv .config/dconf/user-old ~/user-settings-delete-later
dconf load / <old_settings
👉 WINDOW DISPLAY ERROR && UNRESPONSIVE MOUSE UBUNTU
dpkg -l | grep linux-image | awk '{print$2}'
sudo apt-get install linux-image-5.3.0-18-generic
OR BEST ALTERNATIVE
sudo apt-get update && sudo apt-get upgrade
sudoedit /etc/gdm3/custom.conf
->edit
toggle the WaylandEnable line: leaving #WaylandEnable=false commented keeps Wayland enabled; removing the # forces Xorg
sudo systemctl restart gdm3
👉 UPGRADING GNOME IN UBUNTU
It is possible to install GNOME 41 on Ubuntu 20.04. I followed these steps and they worked for me.
Note: All packages are still in the experimental phase. Installing the repositories mentioned below may break your system.
Instructions:
Install the devacom/gnome-41 and devacom/x11 repositories to fix gnome-shell dependencies:
sudo add-apt-repository ppa:devacom/gnome-41
sudo add-apt-repository ppa:devacom/x11
sudo apt-get update
Install gnome-shell before the upgrade (fix desktop will not load):
sudo apt install ubuntu-desktop
You may also want to install the Applications Menu or ArcMenu extensions using your browser.
Upgrade your system:
sudo apt upgrade
Reboot your system
👉 INCREASE SCREENCAST TIME TO 9800 SEC
gsettings set org.gnome.settings-daemon.plugins.media-keys max-screencast-length 9800
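To confirm the new value took effect:
gsettings get org.gnome.settings-daemon.plugins.media-keys max-screencast-length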
👉 GNOME TRANSPARENCY(MENU BAR AND TAB)
Install (Dynamic Panel Transparency ) in ubuntu
OR NEW
Install Transparent Shell for debian packages Transparent Top Bar for rpm packages
👉 GNOME TRANSPARENCY(DASH TO DOCK)
Install dash-dock extension
Then terminal write these commands:
-------------------------------------------
gsettings set org.gnome.shell.extensions.dash-to-dock transparency-mode 'FIXED'
gsettings set org.gnome.shell.extensions.dash-to-dock background-opacity 0.2
-------------------------------------------
👉 (ARCH && FEDORA) ENABLE PANTHEON DESKTOP ENVIRONMENT
sudo systemctl disable lightdm.service
sudo systemctl enable gdm.service
👉 (DEBIAN) ENABLE PANTHEON DESKTOP ENVIRONMENT
sudo systemctl disable lightdm.service
sudo systemctl enable gdm.service
systemctl start gdm
👉 (UBUNTU 20- ONWARDS )XFCE INSTALLATION
sudo apt install xfce4 xfce4-goodies
(OR)
sudo apt install xfce4
(OR)
sudo add-apt-repository ppa:xubuntu-dev/staging
sudo apt update
sudo apt upgrade
sudo apt install xubuntu-desktop
👉 REMOVE XUBUNTU
sudo apt-get autoremove --purge xubuntu-* && sudo apt-get autoremove --purge xfce*
sudo apt-get autoremove --purge $( dpkg-query -l *xubuntu* | grep ii | tr -s " " | cut -d" " -f2; dpkg-query -l *xfce* | grep 'ii' | tr -s " " | cut -d" " -f2 )
👉 CHANGE DOCK IN XFCE
The default dock is nice and pretty much does what it is for. But again, it looks a bit boring.
However, if you want your dock to be better and with a little more customization options, you can install another dock.
Plank is one of the simplest and lightweight docks and is highly configurable.
To install Plank use the command below:
sudo apt-get install plank
If Plank is not available in the default repository, you can install it from this PPA.
sudo add-apt-repository ppa:ricotz/docky
sudo apt-get update
sudo apt-get install plank
Before you use Plank, you should remove the default dock by right-clicking on it and, under Panel Settings, clicking Delete.
Once done, go to Accessory > Plank to launch Plank dock.
👉 FINGERPRINT LOGIN DEBIAN => https://ubuntuhandbook.org/index.php/2024/02/fingerprint-reader-t480s/#google_vignette
INSTALL
sudo add-apt-repository ppa:uunicorn/open-fprintd
sudo apt install open-fprintd fprintd-clients python3-validity
sudo systemctl status python3-validity.service
sudo systemctl enable python3-validity.service
sudo systemctl start python3-validity.service
fprintd-enroll
sudo pam-auth-update
=> In next screen, use up/down arrow keys to choose “Fingerprint authentication”, then press space-bar to enable/disable it. Finally, press Tab to highlight OK and hit Enter.
sudo systemctl enable open-fprintd-resume open-fprintd-suspend
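To confirm the enrollment worked before relying on it at login (fprintd-list and fprintd-verify are provided by fprintd-clients):
fprintd-list $USER
fprintd-verify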
REMOVE
First, open terminal (Ctrl+Alt+T) and run command to re-configure PAM:
sudo pam-auth-update
In the screen, disable fingerprint authentication, then hit Tab and Enter.
Next, run command to uninstall the python3-validity driver:
sudo apt remove --autoremove open-fprintd fprintd-clients python3-validity
Also, remove the Ubuntu PPA by running command:
sudo add-apt-repository --remove ppa:uunicorn/open-fprintd
👉 LINUX OPERATING SYSTEM DISTRIBUTIONS
@@@@@@@@@@@@
For Debian / Ubuntu: .deb packages installed by apt or apt-get and dpkg Package Management System
For Rocky / Fedora / RHEL: .rpm packages installed by yum for redhat, dnf for Fedora Package Management System
For openSUSE: .rpm packages installed by zypper Package Management System
For Arch: .pkg.tar.zst packages installed by pacman Package Management System (AUR packages are built from source with an AUR helper)
For FreeBSD: .txz packages installed by pkg Package Management System
@@@@@@@@@@@@
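For reference, installing the same package under each package manager (htop used as an example package):
sudo apt install htop        # Debian / Ubuntu
sudo dnf install htop        # Rocky / Fedora / RHEL
sudo zypper install htop     # openSUSE
sudo pacman -S htop          # Arch
pkg install htop             # FreeBSD (run as root)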
ARCH (OPENSUSE TUMBLEWEED) => Install on KVM
FEDORA
DEBIAN
UBUNTU (Software Availability + [MX Linux 23.3 version in kvm-qemu on VIRTUAL MACHINE MANAGER for [Best Display] + [Ubuntu 20.04 version] for [High Display and Best Refresh Rate HZ] after upgrading from 18.04 version OR INITIAL Ubuntu 20.04 INSTALLATION // FLASH DRIVE Ubuntu 20.04 for Fixing Grub Boot Loader + [Ubuntu 22.04 version] for Best fingerprint support and Best performance)
----------
KALI PURPLE + KALI LINUX
^^^^^^^^^^^^^^^^^^^^^^^^
🧨🧨🧨🧨 KALI DISTRIBUTIONS FOR COMPLETE INSTALLATION OF TOOLS
Kali Linux Live/Installer 2023 VERSION
Kali Purple 2023 VERSION
^^^^^^^^^^^^^^^^^^^^^^^^
METASPLOITABLE
WINDOWS 10
----------
TAILS => Install on USB-NVME https://www.youtube.com/watch?v=gO9fTnMxwYw
Install these OS on USB-NVME:
KALI PURPLE + KALI LINUX
WINDOWS 10
👉 UBUNTU STUDIO APPS
sudo apt install ubuntustudio-controls ubuntustudio-installer ubuntustudio-desktop
👉 LINUX SOFTWARES
🧨🌀 SYSTEM HARDENING
Firewall -> Firewall app (see the ufw sketch after this list)
Use latest Wifi/ Internet Security Standard e.g WPA2
Install Anonsurf => Cybersecurity installation section
Remove ssh, openssh, ftp, pkexec,policykit-1 => sudo apt remove openssh-client
sudo apt remove pkexec
sudo apt purge policykit-1
sudo apt remove smbclient
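A minimal ufw firewall sketch for the hardening list above (assumes an Ubuntu/Debian system; adjust allowed services to your needs):
sudo apt install ufw
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw enable
sudo ufw status verbose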
------------------------------------------------------------------------------------------------------
Chrome => Check below setup Official installation of Google Chrome
Firefox => SaveFrom.net helper extension, DuckDuckGo Privacy Essentials extension, Netcraft Extension
Vivaldi
Brave
Gnome Night Light OR Redshift
GNS3 -> Network Design
VirtualBox
QEMU/KVM
Visual Studio Code => Check below setup Official installation of Microsoft VSCode
Anaconda
Postgres + PGAdmin -> Dashboard
Couchdb + Couchdb Admin -> Dashboard
Surrealdb
Postman
Slack
Discord
Telegram
Chess
VLC
Rythmbox
Gnome Tweaks
-------------------------------------------------------------------------------------------------------
GNOME EXTENSIONS
-> Transparent Shell for debian packages Transparent Top Bar for rpm packages
-> Open weather
-> dash-to-dock Floating dock
-------------------------------------------------------------------------------------------------------
LINUX PRODUCTIVITY APPS A MUST INSTALLATION https://www.makeuseof.com/must-have-applications-for-linux-users/
🌀 Also check WINDOWS SECTION
Libreoffice
Mozilla Firefox
Gparted
Flameshot is a free screenshot tool.
KBackup is a free tool you can use to create a backup of your data.
FlareGet is a free download manager.
Shotcut is an open-source video editor.
Simplenote is an open-source note-taking application.
Ardour is a digital audio workstation.
OBS, or Open Broadcaster Software, is an open-source and free screen recording application
GIMP is an open-source, cross-platform image editing software with powerful features for handling images, graphics, 3D paintings, and artworks.
Inkscape is a vector editing and designing software for Linux and other OSes that offers powerful technical and artistic illustration tools.
Krita is an advanced painting app for creating digital artwork that equips you with various tools to sketch and paint on a digital playground.
Kdenlive allows you to create professional videos and is suitable for both basic and professional video editing.
*Transmission (uTorrent) is also suitable for your privacy as it doesn't collect any user or analytics data.
7-Zip is a pretty lightweight file archiver. Its small size enables it to provide much better performance. Moreover, it is free and open-source, unlike WinRAR, and is available for all major Linux distros.
EverSticky allows you to save quick notes on your desktop screen, making it a great utility for to-do lists, notes, and reminders. It is an advanced alternative to Sticky Notes.
Okular is a must-have app for handling documents, as it supports all major file formats such as PDF, ePUB, TIFF, Images, DVI, XPS, and ODT.
Clementine is a music player with a minimal, easy-to-use interface.
👉 LINUX A MUST CPU AND GPU DRIVERS INSTALLATIONS
NVIDIA => #Linux AMUST Installations Guide.docx file in drive
MUX => Configure between using nvidia and intel
AMD DRIVERS
AMDGPU = AMD open source kernel display driver.
AMDGPU-Pro = Proprietary display driver and API implementations.
MESA = Open source implementations of various APIs such as OpenGL.
Radeon-Vulkan = Open source MESA implementation of Vulkan.
AMDVLK = Open source vulkan components of AMDGPU-Pro driver.
AMDGPU + MESA + Radeon-Vulkan is what is most commonly used by gamers because it has the best game compatibility and performance.
MESA INSTALLATION
$ sudo lshw -c video OR sudo lspci -nn | grep -e VGA
$ sudo apt install mesa-utils
$ glxinfo -B
$ sudo apt install -y gpg-agent wget
$ wget -qO - https://repositories.intel.com/graphics/intel-graphics.key |
sudo apt-key add -
$ sudo apt-add-repository \
'deb [arch=amd64] https://repositories.intel.com/graphics/ubuntu focal main'
$ sudo apt update
$ sudo apt install wine
$ sudo apt install libwine
$ sudo apt install wine64
$ sudo apt install libva2
$ sudo apt install \
intel-opencl-icd \
intel-level-zero-gpu level-zero \
intel-media-va-driver-non-free libmfx1
$ sudo apt install \
libigc-dev \
intel-igc-cm \
libigdfcl-dev \
libigfxcmrt-dev \
level-zero-dev
👉 FEDORA
Update & Upgrade (DO FIRST):
sudo dnf upgrade --refresh
Enable RPM Fusion:
sudo dnf install https://mirrors.rpmfusion.org/free/fe... -E %fedora).noarch.rpm https://mirrors.rpmfusion.org/nonfree... -E %fedora).noarch.rpm
Install NVIDIA Drivers:
https://rpmfusion.org/Howto/NVIDIA
Install Favorite Apps:
sudo dnf install vlc gimp gparted kdenlive
Install GNOME Tweak Tool:
sudo dnf install gnome-tweak-tool
Install Timeshift backup:
sudo dnf install timeshift
Install Preload:
sudo dnf copr enable elxreno/preload -y && sudo dnf install preload -y
Firefox Tweaks:
about:config
layers.acceleration.force-enabled
gfx.webrender.all
Change DNS:
8.8.8.8,8.8.4.4
Speed up DNF:
echo 'fastestmirror=1' | sudo tee -a /etc/dnf/dnf.conf
echo 'max_parallel_downloads=10' | sudo tee -a /etc/dnf/dnf.conf
Install DNFDragora:
sudo dnf install dnfdragora
Install GNOME Extensions:
dnf install chrome-gnome-shell gnome-extensions-app
Install KDEConnect:
sudo dnf install kdeconnectd
Install Steam:
sudo dnf install steam
Better Fonts:
sudo dnf copr enable dawid/better_fonts -y
sudo dnf install fontconfig-font-replacements -y
sudo dnf install fontconfig-enhanced-defaults -y
Install Bleachbit:
sudo dnf install bleachbit
Install Preload:
sudo apt install preload
Improve Laptop Battery:
sudo dnf install tlp tlp-rdw
Just run the above command; you don't need to do anything else. It'll make your laptop battery last longer by implementing some power-saving protocols.
sudo tlp start
👉 UBUNTU
Update & Upgrade:
sudo apt update; sudo apt upgrade
Install Favorite Apps:
sudo apt install vlc gimp gparted synaptic
Install Ubuntu Restricted Extras (Media Codecs):
sudo apt install ubuntu-restricted-extras
Install Timeshift Backup Tool:
sudo apt install timeshift
Install Preload:
sudo apt install preload
FIREWALL -> Firewall Configuration --- Turn on
CPU PERFORMANCE -> cpupower-gui
Improve Laptop Battery:
sudo apt install tlp tlp-rdw
Just run the above command; you don't need to do anything else. It'll make your laptop battery last longer by implementing some power-saving protocols.
sudo tlp start
Install nala and Specify Top 3 Ubuntu Mirror Downloads
sudo nala fetch
Firefox Tweaks:
about:config
layers.acceleration.force-enabled
gfx.webrender.all
Change DNS:
8.8.8.8,8.8.4.4
Install Tweak Tool:
sudo apt install gnome-tweak-tool
GNOME Extensions:
https://extensions.gnome.org/
👉 SECURE SSH ON LINUX
1. Disable Root Login
This setting should be set by default; it disallows all root logins via SSH. System administrators log in with their own user account and escalate via su or sudo. To verify or modify this setting, run nano /etc/ssh/sshd_config and set the PermitRootLogin parameter to no if it is not already. To make sure the setting is applied, restart the service using systemctl restart sshd or service sshd restart. (A consolidated sshd_config sketch follows after item 18 below.)
2. Disable SSH Protocol 1
SSH Protocol 1 should be disabled by default on newer versions of SSH and Linux. However, if you run an older version of either, ensure that only Protocol 2 is being used. Use nano /etc/ssh/sshd_config to edit the configuration file and look for the Protocol parameter and set it to 2.
3. Disable Host-Based Authentication
Host-based authentication is one method of controlling who and how someone can authenticate to your Linux server via SSH. It’s also not recommended; consider a machine that is compromised but is allowed access to your server. Threat actors now have the ability to log in via SSH to your Linux server thanks to host-based authentication.
The Center for Internet Security (CIS) playbook recommends to lock down this setting in favor of using public key-based authentication. To do this, run nano /etc/ssh/sshd_config to edit the SSH configuration file. Find the parameter HostBasedAuthentication and set it to no. Keep in mind that this setting applies only to SSH using Protocol 2, so make sure you are using this protocol version as mentioned above.
You can also ensure that setting the parameter IgnoreRHosts is set to yes to require your users to authenticate via an approved method. This parameter tells your SSH configuration to ignore the .rhosts and .shosts file.
4. Disable Empty Passwords
Make sure that all users authenticating with your Linux server via SSH have to use some form of authentication rather than an empty password string. To do this, type nano /etc/ssh/sshd_config into the command line, find the parameter PermitEmptyPasswords and ensure this is set to no.
5. Enable and Use Key-Based Authentication
Public key-based authentication is a great method for users to authenticate to your Linux servers using SSH. You’ll need to be able to generate key pairs (public and private key pairs) and lock down or at least audit changes to the home directory in which these key pairs are saved on your server.
Ideally you should both audit and lock down said folder. By default this folder is located here ~/.ssh/. Lock the folder down to just root and wheel using chown root:wheel ~/.ssh/id_rsa and chown root:wheel ~/.ssh/id_rsa.pub. Reviewing audit logs can help you ensure that no unauthorized changes are being made.
Generating SSH keys for this type of authentication is out of scope for this blog post, but you can accomplish this by using the ssh-keygen command. You should also use a longer bit length key like RSA 4096 or higher (4096 is the smallest recommended key length).
6. Enforce a Limit to the Maximum Number of Authentication Attempts
Enabling a limit to the maximum number of authentication attempts through SSH mitigates the risk of a successful brute force attack within your environment. This is an easy control to put in place as it is simply editing a parameter within your SSH configuration file. To do this, use nano /etc/ssh/sshd_config, find the parameter MaxAuthTries and set it to a more appropriate value. The recommendation from CIS is four attempts or less.
7. Enforce an Idle Timeout
Leaving SSH sessions logged in and idle or unattended without a control can increase your attack surface. If an attacker were to gain access to a machine where an unattended SSH session was left open, it would be too easy for them to unleash a whole host of nefarious actions against your Linux SSH server.
To mitigate this risk, use nano /etc/ssh/sshd_config to adjust the parameters for ClientAliveInterval. This value is in seconds and determines the actual length of time before a timeout. A value of 300 or less is recommended for a five-minute session timeout.
8. Disable Users’ Ability to Set Environment Variables
By default OpenSSH in a Linux system allows users the ability to set or modify environment variables within their sessions. This setting is not advised since it could allow a user to bypass the controls you’ve put in place. For example, a user can set a variable to have a process execute a malicious package. To stop any potential malicious action in SSH, set the parameter PermitUserEnvironment to no in /etc/ssh/sshd_config.
9. Audit Changes to Keys and to SSHD_Config
Auditing changes to your SSH keys and to your SSH configuration file are extremely important. You don’t know what you can’t see, and visibility is a key component of any security program. Knowing when changes occurred, how they occurred, and who administered those changes — authorized or otherwise — should be a priority. You can use tools such as auditd, ssh-audit, and more to maintain that visibility.
10. Enforce SSH to Use PAM
Within the context of Linux, PAM is Pluggable Authentication Modules, and should be enabled for SSH by default. You can check and set whether PAM is enabled for SSH by looking for the UsePAM parameter and making sure that it is set to yes. This will allow you to disable host-based authentication in the PAM configuration, as well as edit other security settings such as keyboard authentication and even multi-factor authentication.
11. Restrict SSH Access to Users/Groups That Need It
Keeping in mind the principle of least privilege, you should narrow the scope of who needs to connect via SSH into your servers. Managing the groups and users who can have this ability is easily accomplished in Linux.
One way is to access the AllowUsers and AllowGroups parameters within the /etc/ssh/sshd_config file. Then, populate the parameters with a list of the users and groups that exist on the server who should have access to SSH into your server. You can also leverage the DenyUsers and DenyGroups parameters to explicitly call out who cannot connect via SSH into your server.
12. Disable X11 Forwarding
With X11 forwarding, a remote user can access a graphical application installed on a Linux server. If left unchecked, hackers can use it to monitor other users’ keystrokes or perform malicious activity by opening other X11 connections to remote clients. It’s recommended to disable this. The parameter X11Forwarding should be set to no in the SSH configuration file. You can do this by running nano /etc/ssh/sshd_config and then modify the parameter. Restart your server or at least connect via SSH to make sure your configuration changes are applied.
13. Disable Port Forwarding
Local port forwarding via SSH can be used legitimately inside of an environment. Leaving this configuration on, however, can leave you exposed to a threat actor creating their own backdoor tunnels into your environment.
It is recommended that you disable this setting, which you can do inside of the SSH configuration file at this location /etc/ssh/sshd_config. The parameter for this particular setting is called AllowTcpForwarding, and setting this to no will prevent users from being able to exploit this feature in your environment.
14. Use UFW or Firewalld to Restrict Access
Uncomplicated firewall (UFW) and Firewalld — open source programs that manage firewalls within Linux — are not enabled by default on their respective distributions, or distros. However, you can use them to further lock down which SSH connections are allowed and from what networks or IP addresses you want connections to come from. Make sure to remove less restrictive rules that may already be in place before adding in your own rules.
15. Limit the Maximum Number of Concurrent Sessions
Restricting the maximum number of open SSH sessions adds an extra layer of security by further hardening your SSH configurations. This can also mitigate a denial of service (DoS) attack, especially if you know how many users are expected to be on the same server at the same time. The CIS recommendation is to limit this to a maximum of four concurrent connections. You can do this by running nano /etc/ssh/sshd_config and modifying the parameter MaxSessions to four or less, depending on your needs.
16. Change Log Level of SSH in Linux
Ensuring that SSH is logging appropriately can help find potential bad actors. It’s important to have just the right amount of information, as this will make it easier to find the key security information that you need.
The default log level, INFO, records only user login activity and can leave you wanting for more information, especially if you are in the middle of responding to an incident. VERBOSE is preferred as it will detail logout time in addition to login time and other pertinent information. To change this, edit the LogLevel parameter and change the value from INFO to VERBOSE. You can do this by running nano /etc/ssh/sshd_config to edit the SSH configuration file.
17. Set a Warning Banner
Banner messages can be a deterrent for threat actors, since it places a stern warning to users who connect to your machine via SSH that their access is being logged and any unauthorized use will be tracked and prosecuted (in some environments this last part is of particular import). While this may not stop all potential attackers, it could deter some who see this and decide against proceeding.
To set a warning banner, edit your SSH configuration file at /etc/ssh/sshd_config and modify the Banner parameter to show the directory where to place your banner file with your warning message.
18. Change the Default Port
Security through obscurity is not true security; changing the default SSH port will not make your server more secure. What it will do is prevent it from being easily discovered when port scanning an environment. Avoiding using 22 or some other derivative like 2222 will likely skate by some lower-skilled bad actors.
To make this modification, edit the SSH configuration file using nano /etc/ssh/sshd_config and find the parameter called Port. This will likely be commented out so it will appear like # Port 22. Remove the # and change the port number from 22 to another port.
Restart the SSH service. If you made changes to your UFW or Firewalld configuration, make sure the new port is reflected in your rules to ensure you still can connect to the server via SSH on the new port.
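A consolidated /etc/ssh/sshd_config sketch covering the parameters from items 1-18 above (values follow the CIS-style recommendations mentioned; the AllowUsers names, Banner path and Port number are placeholders to adjust for your environment):
PermitRootLogin no
Protocol 2
HostbasedAuthentication no
IgnoreRhosts yes
PermitEmptyPasswords no
PubkeyAuthentication yes
MaxAuthTries 4
ClientAliveInterval 300
PermitUserEnvironment no
UsePAM yes
AllowUsers adminuser opsuser
X11Forwarding no
AllowTcpForwarding no
MaxSessions 4
LogLevel VERBOSE
Banner /etc/issue.net
Port 5922
Then validate the file and apply:
sudo sshd -t
sudo systemctl restart sshd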
🌀 Install Blumira: Security for IT Teams
👉 OPENSSH IN OPENSUSE
1. Refresh zypper's package list.
> sudo zypper refresh
2. Install openssh package if not already installed (optional).
> sudo zypper install --no-confirm openssh
3. Start sshd service via systemd.
> sudo systemctl start sshd
4. Check if sshd is successfully started (optional).
> sudo systemctl status sshd
5. Enable sshd during system startup.
> sudo systemctl enable sshd
6. Enable firewall rule for ssh.
> sudo firewall-cmd --permanent --add-service=ssh
7. Reload firewall rules.
> sudo firewall-cmd --reload
👉 ssh file transfer tools: scp and rsync.
We can use both of these tools to transfer files between Linux machines. However, we should consider scp when the file size is small — it’s a simple copy tool and is widely available. In our day-to-day use, scp is the easy choice for small, one-off transfers.
We would benefit most from rsync when file size is big or when we need more complex and efficient synchronization. It’s also the better choice for recurring tasks like cron jobs and scripts.
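A quick comparison, using a placeholder host and paths (user@server is hypothetical):
scp ./report.pdf user@server:/home/user/docs/                          # small one-off copy
rsync -avz --progress ./project/ user@server:/srv/backup/project/      # large or recurring sync, only transfers changes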
===============================================================================================================================================
👉 COMPUTER NETWORKS FUNDAMENTALS
-----------------
IP Address
Subnetting
Router
-----------------
NAT
Firewall + Firewall Types
VPN Client and VPN Server + VPN Tunnel -> Provides Anonymity(IP address and Location) and Security(Data and Online activity)
Demilitarized Zone (DMZ)
Port Forwarding
-----------------
What is IP addressing? How IPv4 works| ipv4 vs ipv6 | 5 types of ip classes | public vs private ip -> https://www.youtube.com/watch?v=aSZuzkC24XU
What is subnetting How subnetting works What is subnet mask Explained with real-life examples -> https://www.youtube.com/watch?v=qulRjRFavJI
7 OSI layers explained with real examples | osi vs tcp/ip model -> https://www.youtube.com/watch?v=sCYkeo466Qs
How VPN really works? -> https://www.youtube.com/watch?v=8wDI5_MVdmk
proxy vs vpn - Which is better ? -> https://www.youtube.com/watch?v=OI-7MplHwi0
What is the future of load balancing? -> https://www.youtube.com/watch?v=n-E4O8xnDQM
👉 NETWORK IP ADDRESSES AND SUBNETTING
What are the IP address classes and their corresponding subnet masks?
IP subnetting
Class      Subnet Mask        Supported Hosts per Network
Class A    255.0.0.0          16,777,214 (over 16 million)
Class B    255.255.0.0        65,534 (over 65 thousand)
Class C    255.255.255.0      254
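The host counts come from the number of host bits: usable hosts = 2^(32 - prefix length) - 2. For example, for a /24 (the Class C default mask):
echo $(( 2**(32-24) - 2 ))   # prints 254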
👉 NETWORK COMMANDS => sudo apt install net-tools
ipconfig /all -> Windows
ifconfig -> Linux
nslookup www.google.com -> to get ip address
OR nslookup ip address -> to get domain name
ping www.google.com
tracert www.google.com
netstat -a
nmcli connection add con-name 'Ethernet' ifname enp9s0 type ethernet
sudo netdiscover
rfkill
sudo service NetworkManager start
sudo service NetworkManager stop
👉 MIKROTIK
startx
service gdm3 restart
👉 MIKROTIK ROUTER CONFIGURATION
Default Network IP Address -> 192.168.88.1
MikroTik Routers MAC Addresses
NEW C4:AD:34:B3:41:33
OLD CC:2D:E0:62:46:38
C4:AD:34:B3:41:30
👉 PPPoE
PPPoE user 0711444518
PPPoE password 0000
IP address 10.5.10.52
Gateway 10.5.10.1
tick Firewall Router
Local Network
IP address 192.168.88.1
Netmask 255.255.255.0(/24)
tick DHCP Server
DHCP Server Range 192.168.88.10 - 192.168.88.254
tick NAT
VPN
VPN address 8ce5083f2021.sn.mynetname.net
👉 NETWORK ADMINISTRATION WINDOWS
https://medium.com/@adinika.15/installing-active-directory-on-windows-server-2012-r2-e9e614770588
👉 Assigning static ip address to the local network
👉 LINUX
((( ifconfig OR ip a )))
sudo ifconfig enp9s0 192.168.92.5 netmask 255.255.0.0
OR sudo ifconfig eth0 192.168.92.5 netmask 255.255.0.0
sudo route add default gw 192.168.92.1 enp9s0
OR sudo route add default gw 192.168.92.1 eth0
ping 192.168.92.5
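On distributions where ifconfig/route are deprecated, the same static assignment can be done with the ip tool (same example addresses as above):
sudo ip addr add 192.168.92.5/16 dev enp9s0
sudo ip route add default via 192.168.92.1 dev enp9s0
ip addr show enp9s0   # confirm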
👉 WINDOWS
1)To set a static IP address in Windows 7, 8, and 10:
Click Start Menu > Control Panel > Network and Sharing Center or Network and Internet > Network and Sharing Center.
Click Change adapter settings.
Right-click on Wi-Fi or Local Area Connection.
Click Properties.
Select Internet Protocol Version 4 (TCP/IPv4).
Click Properties.
Select Use the following IP address.
Enter the IP address, Subnet mask, Default gateway, and DNS server.
Click OK.
Your computer displays a static IP address.
2)Turn off firewall if pentesting
👉 LINUX -> Assigning dynamic ip address to the local network
sudo dhclient enp9s0
----------------------------------------------------------
DHCP
sudo apt install isc-dhcp-server
sudo nano /etc/dhcp/dhcpd.conf
EDIT THIS FILE
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
default-lease-time 600;
max-lease-time 7200;
ddns-update-style none;
authoritative;
subnet 192.168.92.0 netmask 255.255.255.0{
range 192.168.92.10 192.168.92.100;
option routers 192.168.92.1;
option subnet-mask 255.255.255.0;
option domain-name-servers 192.168.92.1, 8.8.8.8;
}
host Windows7{
hardware ethernet 00:0c:29:e6:75:b9;
fixed-address 192.168.92.7;
}
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
sudo systemctl restart isc-dhcp-server
sudo systemctl status isc-dhcp-server
👉 RESOLVED BLOCKED WIFI BY VPN(Wifi connection without internet access) IN LINUX
=> STOP VPN FROM RUNNING e.g sudo anonsurf stop
cd /etc/
sudo nano resolv.conf
--------------------------------------------
#nameserver ::1
nameserver 8.8.8.8
nameserver 8.8.4.4
-----------------------------------------------
===============================================================================================================================================
🧨🧨🧨 BIOS 🧨🧨🧨
1st===> Long press Power Button for 1 minute to kill voltage in the mother-board semiconductors. Then start machine.
OR
2nd===> Long press [Windows Button + B] .Then press Power Button to start the machine.
👉 CHANGE ROOT PASSWORD
sudo passwd root
👉 SUSPEND LAPTOP
Power Button Behaviour -> Suspend
👉 QUICK SOLUTION KERNEL OS OPERATIONS
==> OFFLINE
Restart/Reboot
sudo dpkg --configure -a
sudo apt --fix-broken install
----------------------------------------------------
==> ONLINE
sudo apt update
sudo apt upgrade
👉 KILL A SYSTEM PROCESS
ps aux | grep -i dpkg
sudo kill 1977
sudo kill -9 1977 => FORCE KILL
👉 DELETE A LOCKED SYSTEM PROCESS
sudo rm /var/cache/apt/archives/lock
sudo dpkg --configure -a
sudo apt clean
👉 LINUX CHECK SERIAL NUMBER
sudo dmidecode -s system-serial-number
👉 BYPASSING SECURITY FOR A SYSTEM AUDITOR
-> Bypass Label Processing (BLP)
-> System Exits
-> Special System Login IDs
👉 CHECK BATTERY
upower -i /org/freedesktop/UPower/devices/battery_BAT0
👉 REMOVE DOT FILE/FOLDER
rm -rf ~/.<file or folder name> (i.e)
rm -rf ~/.ssh
👉 REMOVE BINARY FILE
sudo rm -rf /usr/local/bin/surreal
👉 RECONFIGURE BROKEN DEPENDENCIES
sudo dpkg --configure -a
sudo apt autoclean
sudo apt clean
sudo apt autoremove
sudo dpkg --purge <file-name>
sudo apt -f install
sudo apt --fix-broken install
sudo apt update && sudo apt upgrade
👉 MAINTENANCE OF SSD
sudo systemctl enable fstrim.timer
systemctl status fstrim.timer
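To run a one-off trim immediately instead of waiting for the timer:
sudo fstrim -av   # trim all mounted filesystems that support it, verbosely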
👉 GRUB RESCUE ERROR FIX UBUNTU
sudo update-grub2
sudo grub-install /dev/sda
sudo add-apt-repository -y ppa:yannubuntu/boot-repair
sudo apt-get update
sudo apt-get install -y boot-repair && boot-repair
👉 GRUB UNABLE TO DETECT OTHER OPERATING SYSTEMS
Install os-prober (usually installed already)
Edit the /etc/default/grub file to add GRUB_DISABLE_OS_PROBER=false to this file.
Save this file.
Update grub (https://itsfoss.com/update-grub/) and Grub should see other operating systems now.
👉 FLASH DRIVE Ubuntu 20.04 for Fixing Grub Boot Loader
1 Install Grub
sudo add-apt-repository -y ppa:yannubuntu/boot-repair
sudo apt-get update
sudo apt-get install -y boot-repair && boot-repair
2 Select Advanced Options
3 Grub Location
Select OS to boot from
4 Edit Grub configuration file
5 Select Apply and OK
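If boot-repair is unavailable, the same repair can be done manually from the live USB via chroot (a sketch for a BIOS/MBR install; /dev/sda2 as the installed root partition is a placeholder, check with lsblk first):
sudo mount /dev/sda2 /mnt
for d in /dev /proc /sys; do sudo mount --bind $d /mnt$d; done
sudo chroot /mnt
grub-install /dev/sda
update-grub
exit
Then unmount and reboot.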
👉 DELETE OLD DEPENDENCIES AND FILES
sudo apt-get autoremove
sudo apt-get autoclean
OR BEST ALTERNATIVE Check FILE HISTORY on Settings and Clear (File History, Trash and Temporary Files)
👉 (PHONE)TERMUX
pkg install lynx
termux-setup-storage
👉 Install Genymotion (ANDROID EMULATOR)
sudo apt update
sudo apt install virtualbox
Download Genymotion from the official website of Genymotion
ls -l ~/Downloads/
chmod +x ~/Downloads/genymotion-3.0.2-linux_x64.bin
sudo ~/Downloads/genymotion-3.0.2-linux_x64.bin
👉 REMOVE
sudo rm -rf
👉 CONFIGURE
sudo dpkg --configure -a
👉 FIX FLASH ON UBUNTU TERMINAL
sudo fdisk -l
1st option
sudo mkfs.msdos -F 32 /dev/sdb
DEFRAGMENT sudo e4defrag /dev/sda7
2nd option
udisksctl power-off -b /dev/sdb
👉 CHECK SYSTEM INFO
lscpu
👉 CHECK NUMBER OF CORES AND THREADS
lscpu | grep -E '^Thread|^Core|^Socket|^CPU\('
👉 CHECK OS VERSION USING PYTHON
python -m platform
python3 -m platform
👉 CHECK OS VERSION
hostnamectl
👉 CHECK OS VERSION IN DEBIAN
lsb_release -d
lsb_release -a
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
-> MINIKUBE username: "docker", password: "tcuser"
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
👉 VIRTUALBOX INSTALLATION
👉 NOT CONNECTING TO THE INTERNET -> sudo service libvirtd restart
INSTALL VIRTUALBOX IN UBUNTU
sudo apt install virtualbox
FIXING 'modprobe' ERROR
Disable SecureBoot Option in Bios
OR
Uninstall virtualbox-dkms and its configurations by running the below commands:
sudo apt-get remove virtualbox-dkms
sudo apt-get remove --purge virtualbox-dkms
Install Linux headers and Linux image by running the below command:
sudo apt-get install -y linux-headers-amd64 linux-image-amd64
Install virtualbox-dkms
sudo apt-get install -y virtualbox-dkms
OR
As the first step run this command:
sudo modprobe vboxdrv
If it's not helpful run these commands:
sudo apt update
sudo apt install --reinstall linux-headers-$(uname -r) virtualbox-dkms dkms
Then reboot your system and after reboot run this command:
sudo modprobe vboxdrv
👉 QEMU/KVM + VIRTUAL MACHINE MANAGER
👉 NOT CONNECTING TO THE INTERNET -> sudo service libvirtd restart
1. https://www.youtube.com/watch?v=vyLNpPY-Je0&list=PLqKkf220xDDbW9G0a4XXO50yvHWFq2uOD&index=1 -> qemu/kvm & virt-manager installation on Ubuntu 22.04 (Installation of Linux)
--->>> Check Windows Doc File
conda deactivate   # make sure apt runs outside any conda environment
sudo apt install qemu-kvm libvirt-daemon-system bridge-utils virt-manager qemu-utils firewalld ebtables dnsmasq gir1.2-spiceclientgtk-3.0   # qemu-img is provided by qemu-utils
sudo apt install libvirt-bin   # only on older releases; on Ubuntu 20.04+ libvirt-bin is replaced by libvirt-daemon-system/libvirt-clients
sudo adduser $USER libvirt
sudo apt install virt-viewer virtinst
Optional extras:
sudo apt install multipass   OR   sudo snap install multipass
sudo apt -y install uvtool
AND
https://www.youtube.com/watch?v=2WDASJ0ye0A -> Windows 10 on KVM (Installation of Windows) Overview + Better Display Option
https://www.youtube.com/watch?v=jLRmVNWOrgo -> Creating Virtual Machines in QEMU | Virt-manager | KVM (Installation of Windows) Detailed
2. https://www.youtube.com/watch?v=DYpaX4BnNlg&list=PLqKkf220xDDbW9G0a4XXO50yvHWFq2uOD&index=2 -> qemu/kvm Bridge and NAT networking
3. https://www.youtube.com/watch?v=1SDvth66i-4&list=PLqKkf220xDDbW9G0a4XXO50yvHWFq2uOD&index=3 -> qemu/kvm Snapshots tutorial
4. https://www.youtube.com/watch?v=0Ny0__uTGfM&list=PLqKkf220xDDbW9G0a4XXO50yvHWFq2uOD&index=4 -> qemu/kvm How to migrate a VM to another computer
5. https://www.youtube.com/watch?v=UcF4JoFqd9E&list=PLqKkf220xDDbW9G0a4XXO50yvHWFq2uOD&index=5 -> qemu/kvm DD tutorial - how to clone, backup and restore disks and partitions
6. https://www.youtube.com/watch?v=N7wvqHvmSN8&list=PLqKkf220xDDbW9G0a4XXO50yvHWFq2uOD&index=6 -> qemu/kvm How to enlarge Qemu/kvm virtual disk
********** ONLY ON LINUX VMS **********
7. https://www.youtube.com/watch?v=9FBhcOnCxM8&list=TLPQMDgxMjIwMjLiC7C3yo4mlw&index=10 -> 3 POINTS Below
► 0:00 Intro
► 2:25 Connecting USB devices
► 7:11 Setting up shared folders
8. Port Forwarding in KVM https://www.cyberciti.biz/faq/kvm-forward-ports-to-guests-vm-with-ufw-on-linux/
(CHECK WINDOWS DOC FILE) IN DRIVE
NOTE: Port Forwarding Basics in Ubuntu Linux https://www.cyberciti.biz/faq/howto-configure-setup-firewall-with-ufw-on-ubuntu-linux/
(CHECK WINDOWS DOC FILE) IN DRIVE
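After installing qemu/kvm and virt-manager as above, a quick sanity check (virsh and virt-host-validate ship with the libvirt packages):
sudo systemctl status libvirtd
virt-host-validate   # checks KVM/IOMMU prerequisites
virsh list --all     # should list the defined VMs (possibly none) without errors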
===============================================================================================================================================
CYBER SECURITY ===> CISM CERTIFICATION www.isaca.org/cismapp -> In order to be CISM Certified
👉👉👉 Keep Notes => Documentation of Commands and Scripts
=> Overview
🌀 The PC SEcurity Channel
=>Intermediate
🌀 NULLBYTE(Red Team)
🌀 DAVID BOMBAL(Both)
-------------------------------------
🌀 HACKERSPLOIT(Both)
=>Advance
🌀 PwnFunction 🌀 https://pwning.tech/ https://github.com/Notselwyn/
🌀 John Hammond
🌀 Low Level Learning
🌀 Crow
-------------------------------------
🌀 Try Hack Me => https://www.youtube.com/c/AlmondForce
🌀 Hack the Box => https://www.youtube.com/c/ippsec
🌀 Hack the Box => https://www.youtube.com/c/CryptoCat23
🌀 Live Overflow
🌀 Cyber Mentor
🌀 IppSec
🌀 STOK
🌀 Nicolas Gregoire
👉 Objective: Networks, API, Database and Endpoint Detection Response[Protection|Detection(Sniffer Analyser)|Response(BlueTeam and RedTeam)]Artificial Intelligence System
🌀 Interest in research: http,tcp,udp
🧨🧨 Implementation and Tools: ssh,ftp,samba,telnet,linux os, forensics
🧨🧨 Controls: Two factor Authentication + Very Strong Pasword/Keys Hash + Key Hash Logins to Secure Hard drive Partition + USB Key Hash Login
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
🧨 Endpoint Detection Response[Protection|Detection(Sniffer Analyser)|Response(BlueTeam and RedTeam)]Artificial Intelligence System
1. Data Cleaning
2. ETL
3. Data Pipeline
4. CI/CD
------------------------
🧨 OWASP
Networks Web[HTML,CSS,Pyscrpt] +API Database
https + Domain XML SQL
Bluetooth RestAPI NOSQL
tcp,udp GraphQL
Websockets: Django,FastAPI,Flask
------------------------
🧨 Hack The Box[ADVANCE]/Try Hack Me[BASIC]
Networks API Database
https + Domain XML SQL
Bluetooth RestAPI NOSQL
tcp,udp GraphQL
Websockets: Django,FastAPI,Flask
------------------------
🧨 Network protocols(Samba, telnet,ssh) + Linux OS [Bash, File System] + C Memory Management + Forensics
===> Remove Remote Connection [ssh, openssh, smbclient, active directory, PolicyKit-1 and pkexec, etc] in Linux and [active directory] in Windows
===> Download from Official Websites to avoid malicious software AND Delete default system utility or software
===> BROWSER, OS, [WEB, DESKTOP, MOBILE APPLICATION] => UPDATE & UPGRADE
1 Use Firewall Hardware and Software
2 Use latest Wifi/Internet Security standard e.g WPA2 <===> CONSTANTLY CHANGE Router [STRONG Password]
3 Use Antivirus
4 Use software to remove tracking e.g WPD -> Remove logs and DISABLE WINDOWS UPDATE
5 ======> Encrypt Partition Drive
======> If using Kali on BARE METAL, Remove storage devices containing DATA
6 Anonsurf (anonsurf-tool) for MAC ADDRESS PROTECTION and NETWORK DEFENSE PROTECTION
7 Auto-tor (anonymous-browsing-tool) for BROWSER PROXY <=> IP ADDRESS PROXY
******************************************************************************************************************************************************************************************
XML
- No APIs, Docs, PDFs
--------------------------------------------------------------------------------
- Basic html index page
APIs (JSON, Python Frameworks, GraphQL) => UPDATE & UPGRADE
- Use Authorisation Control
FRONTEND (HTML, CSS, Pyscript) => UPDATE & UPGRADE
- User Interface + Interactivity
- Dynamic Templates + URLS on [Basic] html pages
BACKEND (Python Frameworks) + (Python) => UPDATE & UPGRADE
- No Dynamic Templates + URLS on [User inputs]
--------------------------------------------------------------------------------
- CSRF Token original + Encrypted Randomization after [User Form Authorisation]
- Dynamic Templates + URLS on html pages after [User Form Authorisation]
SQL INJECTION
Limit user input name + consider it as String[Input Validation]
Limit user password + consider it as String[Input Validation]
🧨🧨🧨 SOFTWARE ENGINEERING BEST PRACTICES
Search Engine Optimization => XML
Frontend => HTML, HTMX, CSS, Pyscript
Data => CSRF, JSON
Templating => Both Frontend and Backend
Backend => Django, Flask, fastAPI
Backend API => Django, Flask, fastAPI
Backend Serialization and Deserialization => Django, Flask, fastAPI
************************************************************************************************************************************************************************************
🧨🧨🧨 CYBERSECURITY BEST PRACTICES IN SOFTWARE ENGINEERING
Search Engine Optimization => XML must receive security updates and upgrades + used only in links of URL Domains
Frontend => HTML, HTMX, CSS, Pyscript must receive security updates and upgrades + programming languages must be strongly-typed
Data => CSRF, JSON (Automatically secured when programming languages and frameworks receive security updates and upgrades) + both must use randomized signatures + csrf token login must have time limit then it must expire
Dynamic Templating => Both Frontend and Backend must receive security updates and upgrades + must only be used in page text details or images + NOT data connected to the database e.g user inputs
Backend => Python, Django, Flask, fastAPI must receive security updates and upgrades + must use packages, libraries supported and maintained by the framework organization
Backend API => Python, Django, Flask, fastAPI must receive security updates and upgrades + must use strong and highly secure security key + security key must NOT be stored within the application folder
Backend Serialization and Deserialization => Python, Django, Flask, fastAPI must receive security updates and upgrades + both serialization and deserialization must occur on the server side with the output transferred to the client side + the server must NOT receive serialized data from the client side to deserialize, nor receive deserialized data from the client side to serialize
👉 CYBERSECURITY WEBSITES => Google search site:tesla.com filetype:csv or filetype:pdf
www.shodan.com -> Identify vulnerable website/ webapp/ mobile app
www.dehashed.com -> Identify cracked passwords
nmap tool, sublist3r, owasp amass -> Information gathering
www.builtwith.com, Wappalyzer -> Identify technology used in a website/ webapp
👉 VULNERABLE MACHINES
Metasploitable -> Linux
Kioptrix from Vulhub Website -> Linux
vulnserver -> On Windows machine
immunity debugger -> On Windows machine
👉 VIRTUALBOX CONFIGURATIONS
RAM 9096
STORAGE 200.00GB
Select iso from Storage on Right-Bottom Menu
👉 KALI VIRTUALBOX INSTALLATION
Go to preferences, select kali virtualbox file and select the configuration options. Then proceed with installation.
👉 METASPLOITABLE INSTALLATION
Follow the usual virtual hard disk installation. But don't create a new virtual hard disk; use an already existing virtual hard disk.
-------------------------------------------------------------------
LINUX OR WINDOWS Depending on Guest or Host
🧨 VIRTUALBOX GUEST ADDITIONS ISO + Install http://download.virtualbox.org/virtualbox/5.0.24/VBoxGuestAdditions_5.0.24.iso
ADDING GUEST EDITION ISO TO ENLARGE THE SCREEN + OTHER FEATURES
TERMINAL OPTION
# su
# apt-get update && apt-get install -y linux-headers-$(uname -r)
# cp /media/cd-rom/VBoxLinuxAdditions.run /root/
# chmod 755 /root/VBoxLinuxAdditions.run
# cd /root
# ./VBoxLinuxAdditions.run
UI OPTION
🧨 VIRTUALBOX EXTENSION PACK HOST -> Oracle_VM_VirtualBox_Extension_Pack-6.1.26.vbox-extpack
->Go to File, Preferences, Extensions, Add Oracle_VM_VirtualBox_Extension_Pack
🧨 ACCESSING THE HOST FROM GUEST
->Go to Settings, Shared Folders, Machine Folders, Choose Folder Path e.g Downloads, Select Auto-mount and (Make-permanent will appear afterwards)
🧨 CONNECTING VMS TO A COMMON NETWORK
->Go to File, Preferences, Network, Add NAT Network (e.g)Network Name: NATNetworkPentesting Network CIDR: 10.0.2.0/24 --> You can also do Port Forwarding
✅ KALI TOOLS SOURCE CODES ------> GOOGLE DRIVE ====> [email protected] ------> KALI FOLDER
✅ KALI TOOLS SOURCE CODES ------> GITHUB ====> [email protected] devopsengineerDan ------> Cybersecurity Repositories
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
✅ KALI TOOLS DESCRIPTION AND FUNCTIONS ------> https://www.kali.org/tools/all-tools/
✅ KALI TOOLS DESCRIPTION ------> GITLAB ====> (GITHUB AUTHENTICATION [email protected] https://github.com/DancunManyinsa) ------> GITLAB https://gitlab.com/DancunManyinsa/all-kali-tools
👉 INSTALL KALI PURPLE THEN INSTALL KALI LINUX PENTESTING TOOLS
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
👉 KALI NETHUNTER - ANDROID - MOBILE
Install Dependencies In Termux pkg update -y && pkg install wget -y
Download Script In HOME wget https://raw.githubusercontent.com/rc-chuah/Kali-Nethunter-In-Termux/master/install-nethunter-full-termux
Give Execution Permission chmod +x install-nethunter-full-termux
Run Script ./install-nethunter-full-termux
Now Just Start Kali Nethunter nethunter
(OR)
termux-setup-storage
pkg install wget
wget -O install-nethunter-termux https://offs.ec/2MceZWr
chmod +x install-nethunter-termux
./install-nethunter-termux
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
👉 SOLVE FUNCTIONALITY PROBLEMS OF KALI TOOLS
Check Repository -> git clone https://github.com/Dewalt-arch/pimpmykali.git
OR
Update and upgrade Kali
👉 INSTALLATION OF KALI TOOLS
*Install kali tools ( install katoolin)
OR
*Easier option
Install Advance Pentesting Tools
sudo apt install offsec-pwk
sudo apt install offsec-awae
System
kali-linux-core: Base Kali Linux System – core items that are always included
kali-linux-headless: Default install that doesn’t require GUI
kali-linux-default: “Default” desktop (amd64/i386) images include these tools
kali-linux-light: Kali-Light images use this to be generated
kali-linux-arm: All tools suitable for ARM devices
kali-linux-nethunter: Tools used as part of Kali NetHunter
Desktop environments/Window managers
kali-desktop-core: Any key tools required for a GUI image
kali-desktop-e17: Enlightenment (WM)
kali-desktop-gnome: GNOME (DE)
kali-desktop-i3: i3 (WM)
kali-desktop-kde: KDE (DE)
kali-desktop-lxde: LXDE (WM)
kali-desktop-mate: MATE (DE)
kali-desktop-xfce: Xfce (WM)
Tools
kali-tools-gpu: Tools which benefit from having access to GPU hardware
kali-tools-hardware: Hardware hacking tools
kali-tools-crypto-stego: Tools based around Cryptography & Steganography
kali-tools-fuzzing: For fuzzing protocols
kali-tools-802-11: 802.11 (Commonly known as “Wi-Fi”)
kali-tools-bluetooth: For targeting Bluetooth devices
kali-tools-rfid: Radio-Frequency IDentification tools
kali-tools-sdr: Software-Defined Radio tools
kali-tools-voip: Voice over IP tools
kali-tools-windows-resources: Any resources which can be executed on a Windows hosts
Menu
kali-tools-information-gathering: Used for Open Source Intelligence (OSINT) & information gathering
kali-tools-vulnerability: Vulnerability assessments tools
kali-tools-web: Designed doing web applications attacks
kali-tools-database: Based around any database attacks
kali-tools-passwords: Helpful for password cracking attacks – Online & offline
kali-tools-wireless: All tools based around Wireless protocols – 802.11, Bluetooth, RFID & SDR
kali-tools-reverse-engineering: For reverse engineering binaries
kali-tools-exploitation: Commonly used for doing exploitation
kali-tools-social-engineering: Aimed for doing social engineering techniques
kali-tools-sniffing-spoofing: Any tools meant for sniffing & spoofing
kali-tools-post-exploitation: Techniques for post exploitation stage
kali-tools-forensics: Forensic tools – Live & Offline
kali-tools-reporting: Reporting tools
OTHER ESSENTIAL TOOLS
kali-linux-large: Our previous default tools for amd64/i386 images
kali-linux-everything: Every metapackage and tool listed here
kali-tools-top10: The most commonly used tools
kali-desktop-live: Used during a live session when booted from the image
👉 MAC ADDRESS SPOOFING + ANONSURF
packeth - Mac Address spoofing and Network Detection Evasion
OR
arp -a -> Windows: check MAC address
sudo apt install macchanger
macchanger -r eth0
=>AUTOMATING MAC ADDRESS SPOOFING
sudo crontab -e   (use the root crontab, since changing a MAC address requires root)
@reboot macchanger -r eth0
----------------------------------------------------------------------------------
👉 MAKE YOUR IP ADDRESS ANONYMOUS
🧨🧨 USING (Github anonymous-browsing-tool) AUTO TOR ====> IP ADDRESS [Application and Data Link Layers]
1. git clone https://github.com/devopsengineerDan/anonymous-browsing-tool.git
2. cd anonymous-browsing-tool
3. python3 install.py
4. aut
5. 5 ===> 5 seconds
6. 0 ===> [0 to infinite IP change]
7. In your browser / PC network settings, set the SOCKS proxy to 127.0.0.1:9050
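To confirm that traffic actually leaves through the Tor SOCKS proxy, a minimal Python sketch is shown below. It assumes the requests library is installed with SOCKS support (pip install requests[socks]) and that Tor is listening on 127.0.0.1:9050; the IP-echo URL is just an example service.

```python
import requests

# Assumes Tor (or the auto-tor tool above) is listening on 127.0.0.1:9050.
# socks5h:// also routes DNS resolution through the proxy.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def current_ip(proxies=None) -> str:
    """Return the public IP address as seen by an external echo service."""
    return requests.get("https://api.ipify.org", proxies=proxies, timeout=30).text

if __name__ == "__main__":
    print("Direct IP:", current_ip())
    print("Tor IP   :", current_ip(TOR_PROXIES))
```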
🧨🧨 USING (Github anonsurf-tool)ANONSURF ====> MAC ADDRESS [Presentation, Session, Transport, Network & Data link Layers]
I. Cloning the anonsurf repo in the local system
Open a terminal window by using Ctrl+Alt+T and execute the git command below to download anonsurf by cloning it from the GitHub repository:
$ git clone https://github.com/Und3rf10w/kali-anonsurf.git
II. Installing anonsurf
Now, Change the directory to where anonsurf is already downloaded by executing the following command:
$ cd kali-anonsurf/
III. Changing file permissions of the installer
Before we execute the installer, we need to change the file permissions using the following command,
$ chmod +x installer.sh
IV. Executing the installer
Then, install anonsurf using the command mentioned below. Press Y to confirm when prompted. This process will take a few minutes, after which we should be able to keep our personal data clear of prying eyes.
$ sudo ./installer.sh
Anonsurf is now installed. To check that the installation succeeded, we can simply enter:
$ anonsurf
V. Working with various anonsurf options
We can start anonsurf service in anon mode by running the following command:
sudo anonsurf start
=> Changing the identity using tor
What the change option effectively does is change the Tor circuit. By default, all of the computer's traffic is routed via 3 nodes: guard, relay and exit.
Depending on the flavour of the installation it could be sudo anonsurf changeid or just use:
$ sudo anonsurf change
=> Checking the current IP address
To check your current IP address, execute the following command:
$ sudo anonsurf myip
=> Checking the current status of the network
To check the status, run the following command:
$ sudo anonsurf status
=> Stopping the anonsurf service
To stop anonsurf, execute the following command:
$ anonsurf stop
=> Get more information on anonsurf
To know more about anonsurf, You can refer to the help page by running the following command:
$ anonsurf help
🧨 MANAGING NETWORK MONITORING AND MALWARE DETECTION => Aurora software
🧨🧨🧨 Threat Protection, Detection and Response => Blumira
🧨 SECURITY IN SOFTWARE DEVELOPMENT => STRIDE Software Design Development
🧨 Cloud Access Security Broker CASB => Microsoft Defender for Cloud
🧨 Cloud Broker
🧨 MANAGING LOGS FOR AN AUDITOR => SIEM system e.g Splunk software
🧨 Install anonsurf => For Hiding IP(Making you anonymous)
🧨 Type kali-undercover => Switch the desktop environment to a Windows-style look
WIFI => 🧨 sudo apt install wifiphisher cri-tools hubble imhex kustomize rekono-kbx rz-ghidra
Raspberry Pi => 🧨 sudo apt install netexec netscanner obsidian sippts sprayhound
👉 WIFI HACKING
NULLBYTE -> (Automated) https://www.youtube.com/watch?v=qpnpI_mF3Aw
DAVID BOMBAL -> (Detailed) https://www.youtube.com/watch?v=J8A8rKFZW-M
Wireless Hacking WIFITE WIFI PHISHER -> https://www.youtube.com/watch?v=FVBti1vhJVA
👉 STEAL WIFI PASSWORD
CYBER MENTOR -> https://www.youtube.com/watch?v=1nWPbKitj9w
👉 API HACKING
JOHN HAMMOND -> How To Hack APIs with Python -> https://www.youtube.com/watch?v=NIceKjhZhrI
👉 INCIDENT RESPONSE
MITRE ATT&CK FRAMEWORK -> https://mitre-attack.github.io/attack-navigator/
MITRE ATT&CK FRAMEWORK Tutorials
Basic -> https://www.youtube.com/watch?v=WmQPtk3Ybxs
Advanced -> https://www.youtube.com/watch?v=ujaoOWmkGLY
👉 PENTESTING REDTEAM FUNDAMENTALS
HACKERSPLOIT -> https://www.youtube.com/watch?v=ChgqGBwl8NQ&list=PLBf0hzazHTGMjSlPmJ73Cydh9vCqxukCu&index=16
Edureka Ethical Hacking 10hours -> https://www.youtube.com/watch?v=dz7Ntp7KQGA
👉 SOC ANALYST TOOLS
AlienVault: https://bit.ly/42Z9mzG
LogRhythm: https://bit.ly/3NkUaXy
IBM Q-Radar: https://bit.ly/3prU4FW
Splunk: https://bit.ly/44mPdV4
𝐈𝐃𝐒 𝐓𝐨𝐨𝐥𝐬:
OSSEC: https://bit.ly/46rKovu
Snort: https://bit.ly/3PrO4Yi
𝐄𝐃𝐑/𝐗𝐃𝐑 𝐓𝐨𝐨𝐥𝐬:
CrowdStrike: https://bit.ly/3Nw9L73
Carbon Black: https://bit.ly/3D85MZD
Sentinel One: https://bit.ly/3CLgp4l
Defender for Endpoint: https://bit.ly/3r1ZyYq
𝐖𝐢𝐫𝐞𝐬𝐡𝐚𝐫𝐤:
Wireshark Masterclass: https://bit.ly/45TkEXp
👉 𝗠𝗮𝗹𝘄𝗮𝗿𝗲 𝗔𝗻𝗮𝗹𝘆𝘀𝗶𝘀 𝗖𝗼𝘂𝗿𝘀𝗲 𝗮𝗻𝗱 𝗖𝗲𝗿𝘁𝗶𝗳𝗶𝗰𝗮𝘁𝗶𝗼𝗻
1) Ultimate Malware Analysis by Zero2Automated
https://lnkd.in/dN7v2zNj
2) Practical Junior Malware Research by TCM Security
https://lnkd.in/dR6mTmQ8
3) Giac Reverse Engineering and Malware by SANS
https://lnkd.in/d6UvAbun
4) Certified Malware Analyst by Ethical Hackers Academy
https://lnkd.in/dt3xQFQT
5) Malware Analysis by Red Team Academy
https://lnkd.in/duBMbZV8
6) Malware Analysis Course by Black Storm Security
https://lnkd.in/dwhn_uuH
7) Practical Malware Analysis: The Hands-On Guide to Dissecting Malicious Software
https://lnkd.in/dmyhKDBV
8) Malware Analysis Fundamentals by Let's Defend
https://lnkd.in/dSDUeyP7
9) Malware Analysis Detection Engineering
https://lnkd.in/dSEA37wQ
10) Malware Analysis Master by Mandiant
https://lnkd.in/dNM54d2C
11) CS6038/CS5138 Malware Analysis
https://class.malware.re/
12) Malware Analysis CSCI 4976 by RPISEC
https://lnkd.in/dC7kZkAK
13) Reverse Engineering 101 by Malware Unicorn
https://lnkd.in/dwGu22if
14) Purple Team Analyst by CyberwarfareLabs
https://lnkd.in/dBAqYG3j
15) Advanced Endpoint Investigation
https://lnkd.in/dgTBDWaW
👉 RED TEAM TOOLS
1-NETWORK ANALYSIS
·Wireshark: https://www.wireshark.org
·pfSense: https://www.pfsense.org
·Arkime: https://arkime.com
·Snort: https://www.snort.org
Suricata: https://suricata.io/
2-OS ANALYSIS
·Helk: https://lnkd.in/di4rQuNb
·Volatility: https://lnkd.in/dBr4yVYa
·RegRipper: https://lnkd.in/dq2hTNQw
·OSSEC: https://www.ossec.net
·osquery: https://osquery.io
3-INCIDENT MANAGEMENT
·TheHive: https://lnkd.in/dkR-d4JB
·GRR Rapid Response: https://lnkd.in/d42-6faP
4-HONEYPOTS
·Kippo: https://lnkd.in/d2ypa3j4
·Cowrie: https://lnkd.in/dAR68JQt
·Dockpot: https://lnkd.in/dgn7MpQg
·HonSSH: https://lnkd.in/dMKptyHz
5-THREAT INTELLIGENCE
·Misp: https://lnkd.in/dkcbKsTN
·MSTICPy: https://lnkd.in/dBjgWVqY
.Threatpost: https://threatpost.com/
Dark Reading: https://lnkd.in/df3bwx2n
6-EDR
·Cortex XDR: https://lnkd.in/devusd8T
·Cynet 360: https://lnkd.in/dZTXUwBE
·FortiEDR: https://lnkd.in/daTMkVxb
.Xcitium: https://lnkd.in/d6U4_auq
7-SIEM
·OSSIM: https://lnkd.in/dXegU3-5
·Splunk: https://www.splunk.com
·LogRhythm: https://logrhythm.com
·Wazuh: https://wazuh.com
.Qradar: https://lnkd.in/dCtbkk-H
👉 BLUE TEAM TOOLS
1️⃣ IP & URL Reputation
🌀1. Virus Total : https://lnkd.in/eweERpju
🌀2. URL Scan : https://urlscan.io/
🌀3. AbuseIPDB: https://www.abuseipdb.com/
🌀4. Cisco Talos: https://lnkd.in/g7uWdC5q
🌀5. IBM X-Force: https://lnkd.in/gt8iyHE5
🌀6. URL Filtering(Palo Alto): https://lnkd.in/e4bkm5Eq
🌀7. URL Filtering(Symantec): https://lnkd.in/g4qQGsHG
🌀8. IP Void: https://www.ipvoid.com/
🌀9. URL Void: https://www.urlvoid.com/
2️⃣File | Hash | Search | Analysis | Sandboxing
🌀1. File Extension >>https://filesec.io/#
🌀2. LOLBAS >>https://lnkd.in/dDa8XgiM
🌀3. GTFOBins >>https://lnkd.in/dRVzVz87
🌀4. File Hash Check >> https://lnkd.in/gNqxtn4d
🌀5. Hash Search >> https://lnkd.in/eMjdTB2t
🌀6. Hash Search >> https://www.malwares.com/
🌀7. MetaDefender >> https://lnkd.in/e6r4mGv5
🌀8. Kaspersky Threat Intel. >> https://lnkd.in/eSNMn7au
🌀9. Cuckoo Sandbox >> https://cuckoosandbox.org/
🌀10. AnyRun >> Online sandboxing >> https://any.run/
🌀11. Hybrid-Analysis >> https://lnkd.in/gaRGY8kB
🌀12. Joe Sandbox >> https://lnkd.in/gTJJ9GiC
🌀13. VMRay Sandbox >> https://www.vmray.com/
🌀14. Triage >> http://tria.ge/
🌀15. Browser Sandbox >> https://lnkd.in/gjA-QqdX
3️⃣ File hash
🌀HashTools> Windows > https://lnkd.in/gTjru2RQ
Powershell :
🌀Get-FileHash -Path C:\path\to\file.txt -Algorithm MD5
$stream = [System.IO.MemoryStream]::new([System.Text.Encoding]::UTF8.GetBytes("This is a string"))
Get-FileHash -InputObject $stream -Algorithm MD5   # -InputObject expects a stream, not a plain string
🌀QuickHash > MacOS > https://lnkd.in/gZc8FYpU
Terminal: shasum -a 256 filename
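The same hashes can also be produced cross-platform with Python's standard library; a minimal hashlib sketch follows (the file name is a placeholder).

```python
import hashlib
from pathlib import Path

def file_hash(path: str, algorithm: str = "sha256", chunk_size: int = 65536) -> str:
    """Hash a file in chunks so large files never have to fit in memory."""
    digest = hashlib.new(algorithm)
    with Path(path).open("rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    print(file_hash("sample.bin"))           # SHA-256 by default
    print(file_hash("sample.bin", "md5"))    # same digest HashTools/shasum would report
```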
4️⃣Find Suspicious Artifacts | Reverse Engineer | Debug Files
🌀1. PeStudio: https://lnkd.in/gjYKbyge
🌀2. CFF Explorer: https://lnkd.in/ggTCTeAi
🌀3. DocGuard files: https://www.docguard.io/
🌀4. File Scan: https://lnkd.in/ejBt5R7C
🌀5. Ghidra >> https://ghidra-sre.org/
🌀6. IDA Pro >>https://lnkd.in/eWA9MnMY
🌀7. Radare2/Cutter >>https://lnkd.in/gV4k5Gsw
https://lnkd.in/gdb3MQn2
5️⃣Monitor System Resources | Detect malware
🌀Process Hacker >> https://lnkd.in/gxV3PAnG
🌀Process Monitor >> https://lnkd.in/gPqzyB7K
🌀ProcDot >> https://www.procdot.com/
🌀Autoruns >> https://lnkd.in/gkZqkZrd
🌀TcpView >>https://lnkd.in/gQZM_SJz
6️⃣Web proxy
🌀Fiddler >> https://lnkd.in/gnJ9BvFN
7️⃣Malware Samples - Abuse.ch
🌀MalwareBazaar ==> https://bazaar.abuse.ch/
🌀FeodoTracker ==> https://lnkd.in/gyN_diCQ
🌀SSLBlacklist ==> https://sslbl.abuse.ch/
🌀URLHaus ==> https://urlhaus.abuse.ch/
🌀ThreatFox ==> https://lnkd.in/gB2gDZUd
🌀YARAIfy ==> https://yaraify.abuse.ch/
👉 PENTESTING COURSES
EC-Council Cybersecurity SCHOLARSHIP
https://campaigns.eccouncil.org/cyber-career-starter-scholarship?utm_source=pop-up&utm_medium=essentials&utm_campaign=cct-global-scholarship
🗝️Penetration Testing Course by TCM Security
https://lnkd.in/eb4ieU7b
🗝️Penetration Testing Course by IT Masters
https://lnkd.in/ei6RrNXw
🗝️Internet of Things Intro by Stanford
https://lnkd.in/eCrhKFKR
🗝️Free Cyber Defense Training by Mandiant
https://lnkd.in/eSfbiTeY
🗝️Digital Forensic Courses List by DFIRDiva
https://lnkd.in/e4RRihcn
🗝️Blue Team Training Course by Security Blue
https://lnkd.in/ehwEzNfp
🗝️Splunk Free Course by Splunk
https://lnkd.in/e7CP_uqU
🗝️Cyber Threat Intelligence 101 by Arcx
https://lnkd.in/ePqK-fNN
🗝️Cyber Security Free Course by Palo Alto
https://lnkd.in/eiFkSHiK
👉 DEVELOPER COURSES
OSINT: Content Discovery
https://lnkd.in/gkHuEfPw
Hack Network PCs
https://lnkd.in/gRgYwd7h
Cyber Security: Best Practices, Trends, Technologies &Myths
https://lnkd.in/gMXqJ5uH
Ultimate Jenkins Certified Engineer Practice Exam 2023 | JCE
https://lnkd.in/gdKkVhxa
Setup Ubuntu Server
https://lnkd.in/gjWxid45
Linux Terminal for beginners
https://lnkd.in/gUQgHC84
Git for Beginners
https://lnkd.in/gmDEHZhC
The Ultimate MySQL Crash Course 2023
https://lnkd.in/gibwKktX
Disaster Recovery (BCDR) in Azure using ASR, Chaos Studio
https://lnkd.in/gQ2iKSYq
OOP Design Patterns in Python
https://lnkd.in/gyJMtu4q
PCEP Certification Exam Preparation: 4 Practice Tests
https://lnkd.in/g9v7kffm
Python Certification Exam Preparation: 6 Practice Tests
https://lnkd.in/ggJFr5ff
Python Certification Exam PCEP-30-02 - Preparation (2023)
https://lnkd.in/gXjdFrAM
Python Programming- Practice Tests & Interview Questions
https://lnkd.in/gHi6xVRx
Python And Flask Framework Complete Course For Beginners
https://lnkd.in/gbKgt2Kq
Scrum Master Certification Mock Exams 2023+Agile Scrum
https://lnkd.in/gV-TncUE
Practice Exams (3 Sets) - Docker Certified Associate (DCA)
https://lnkd.in/g7skvn4H
Linear Programming basics
https://lnkd.in/gvs-Fc7X
Master all the MS Excel Macros and the basics of Excel VBA
https://lnkd.in/gzbt_vCB
Master Android Application Build 3 Applications from Scratch
https://lnkd.in/gp8kqmEx
Android App's Development Masterclass - Build 2 Apps - Java
https://lnkd.in/gQZzapBH
Fastest Laravel app building trick
https://lnkd.in/gS8S2zVN
C Programming Language- Practice Tests & Interview Questions
https://lnkd.in/g-KfQF2v
Debugging Basics
https://lnkd.in/gZA7exPA
React JS Coding Interview Challenges and Exercises [2023]
https://lnkd.in/g5s6ed2M
Master Prototype & Product Design, Development & Engineering
https://lnkd.in/gwftYdjA
ISO 14001:2015 - Awareness on Environment Management (EMS)
https://lnkd.in/gJMU7xsJ
Business Continuity Management Awareness on ISO 22301
https://lnkd.in/gCrWwaAe
👉 CYBERSECURITY FUNDAMENTALS
**— Cyber Essentials —**
📘 ISC(2) Certified in Cyber - https://lnkd.in/e6jB_6af
📘 Cyber Security - https://lnkd.in/eueCSF6A
📘 Cisco Cyber Induction - https://lnkd.in/e8C3jacc
📘 Cisco Cyber Essentials - https://lnkd.in/eTQNsbyF
📘 Fortinet NSE - https://lnkd.in/es3c_Q6E
**— Hacking —**
📗 CodeRed Hacking Essentials - https://lnkd.in/eJbyZp_9
📗 PortSwigger Web Hacking - https://lnkd.in/eEa-fNfu
**— Engineering —**
📙 Secure Software Development - https://lnkd.in/ebGpA4wG
📙 Maryland Software Security - https://lnkd.in/e3z4zFmJ
📙 Stanford Cyber Resiliency - https://lnkd.in/eg9BM5Bv
**— Core Skills —**
#Cisco Network Essentials - https://lnkd.in/eJQFjmcA
#AWS Cloud - https://lnkd.in/e_auX7VE
#Azure Cloud - https://lnkd.in/e4UacMGK
#GCP Cloud - https://lnkd.in/eDNWnVsD
👉 WEB APPLICATION TOOLS
💻Burp Suite - Framework.
https://lnkd.in/gwV8u6Bp
💻ZAP Proxy - Framework.
https://www.zaproxy.org/
💻wfuzz - bruteforcing web
https://lnkd.in/gyyhkZQF
💻Wapiti
https://lnkd.in/gDaTe5XV
💻w3af
http://w3af.org/
💻Dirsearch - HTTP bruteforcing.
https://lnkd.in/gQ-qWnCw
💻Nmap - Port scanning.
https://nmap.org/
💻Sublist3r - Subdomain discovery.
https://lnkd.in/g4SG9neM
💻Amass - Subdomain discovery.
https://lnkd.in/gpBABMBf
💻SQLmap - SQLi exploitation.
http://sqlmap.org/
💻Nogotofail
https://lnkd.in/g8PMWqiw
💻Iron Wasp
https://ironwasp.org/
💻Arachni
https://lnkd.in/gyD3s7EN
💻Metasploit - Framework.
https://lnkd.in/gmnyrxuv
💻WPscan - WordPress exploitation.
https://wpscan.com/
💻Nikto - Webserver scanning
https://lnkd.in/gV8ErHDW
💻HTTPX - HTTP probing.
https://lnkd.in/g83CQ5Wn
💻Nuclei - YAML based template scanning.
https://lnkd.in/gkp4CK2v
💻FFUF - HTTP probing.
https://lnkd.in/ghRhwWGe
💻Subfinder - Subdomain discovery.
https://lnkd.in/gmpujTAi
💻Masscan - Mass IP and port scanner.
https://lnkd.in/g_9PP_7F
💻Lazy Recon - Subdomain discovery.
https://lnkd.in/geT-Bm9t
💻XSS Hunter - Blind XSS discovery.
https://xsshunter.com/
💻Aquatone - HTTP based recon.
https://lnkd.in/gvbB7Rgd
💻LinkFinder - Endpoint discovery through JS files.
https://lnkd.in/gfRM6Sq6
💻JS-Scan - Endpoint discovery through JS files.
https://lnkd.in/gCp2EyRY
💻Parameth - Bruteforce GET and POST parameters.
https://lnkd.in/gsPF4pTM
💻truffleHog - Find credentials in GitHub commits.
https://lnkd.in/gc5p5Jss
👉 APPLICATION PENTESTING TOOLS
✔️WordPress admin finder
🔗 Link : https://lnkd.in/dyvDJ9dX
✔️Smb scanner tool
🔗 Link : https://lnkd.in/dgpBTFpa
✔️Heartbleed scanner
🔗 Link : https://lnkd.in/dcG385GB
✔️weevely php web shell
🔗 Link : https://lnkd.in/dxAYb-Va
✔️Webponized web shell
🔗 Link : https://lnkd.in/dbQHEwrG
✔️Nikto web scanner tool
🔗 Link : https://lnkd.in/dYE8NcMW
✔️Auto Ip or domain Attacking Tool
🔗 Link : https://lnkd.in/duch7ggE
✔️Click jacking vulnerability scanner
🔗 Link : https://lnkd.in/dpirNWRG
✔️All in 1 information gathering and web penetration tool DTect
🔗 Link : https://lnkd.in/dSH-WanK
✔️Detect phishing URL
🔗 Link : https://lnkd.in/djeHhvb3
✔️Dos attack tool - Golden eye
🔗 Link : https://lnkd.in/dHci7XAf
✔️Dos attack with hulk
🔗 Link : https://lnkd.in/dZMYVfs5
✔️Sql vulnerability scanner
🔗 Link : https://lnkd.in/dv5vFTDK
✔️hack website with sqlmap
🔗 Link : https://lnkd.in/dDah_N5u
✔️information and vulnerability scanner with striker
🔗 Link : https://lnkd.in/dViQEykp
✔️web server attacking tool with dost
🔗 Link : https://lnkd.in/d-pcrbA8
✔️advanced multithreaded admin panel finder
🔗 Link : https://lnkd.in/daHfwAZU
✔️Ssl vulnerability scanner
🔗 Link : https://lnkd.in/dmxRyixm
✔️sublister - Subdomain enumeration
🔗 Link : https://lnkd.in/d2R6A_-B
✔️WordPress vulnerability scanner and attacker
🔗 Link : https://lnkd.in/dTi9viG5
✔️Hunner scanner framework
🔗 Link : https://lnkd.in/dceH9pM6
✔️Red hawk all in 1 information gathering and scanning tool
🔗 Link : https://lnkd.in/dkBb-8yp
✔️Dos attack tool with Xerxes
🔗 Link : https://lnkd.in/dbfy73xq
✔️social fish phishing tool
🔗 Link : https://lnkd.in/dAXKiEZ8
✔️weeman phishing tool no root
🔗 Link : https://lnkd.in/dq4NiqW5
✔️WordPress security scanner Wpseku
🔗 Link : https://lnkd.in/dDwTzntt
✔️IDN homograph attack tool
🔗 Link : https://lnkd.in/dpVkBe3r
✔️Detect security flaws with CMS
🔗 Link : https://lnkd.in/d4dGXz2k
✔️Fire crack , admin, finders, deface, bing dorking etc
🔗 Link : https://lnkd.in/dkXhiR5c
✔️Pish web tool
🔗 Link : https://lnkd.in/dcDFz2xX
✔️MITM attack tool
🔗 Link : https://lnkd.in/dvCRNsZq
✔️kill shot pentesting framework
🔗 Link :https://lnkd.in/dUvWJMVW
👉 MALWARE ANALYST TOOLS
🧨Cuckoo Sandbox: An open-source sandbox that provides automated analysis of suspicious files and behaviors.
🔗https://lnkd.in/dgzZiYzY
🧨Any.Run: A web-based interactive sandbox that allows users to run and analyze malware in real-time.
🔗 https://any.run/
🧨Joe Sandbox: Community Edition of Joe Sandbox offers a free version for malware analysis.
🔗 https://lnkd.in/dwjPpwQT
🧨IRMA (Incident Response & Malware Analysis): An open-source platform for analyzing malware and performing incident response
🔗 https://lnkd.in/dzzwCHsk
🧨Viper: A framework for analyzing malware samples. It provides a user-friendly web interface and supports multiple analysis tools.
🔗 https://lnkd.in/d6-U6YY4
🧨Limon Sandbox: An open-source sandbox developed to automate the analysis of Linux malware.
🔗 https://lnkd.in/dzM2--Ux
🧨BinaryGuard: Automated malware analysis on a revolutionary bare-metal platform in the cloud.
🔗https://lnkd.in/dnEF-h-d
🧨Hybrid Analysis: A cloud-based platform that combines sandbox analysis with threat intelligence feeds.
🔗 https://lnkd.in/dz3rmuwy
🧨Filescan: Operates at 10x the speed of traditional sandboxes with 90% less resource usage
🔗 https://lnkd.in/dgQM25_B
🧨SandBlast: A malware analysis system by Check Point
🔗 https://lnkd.in/drBEghCn
===============================================================================================================================================
☀️ Learn SQL
✅ Aggregations with GROUP BY
✅ Joins (INNER, LEFT, FULL OUTER)
✅ Window Functions
✅ Normalisation
✅ Constraints in SQL
✅ Common Table Expressions
✅ How to design and implement data models and databases using tools such as MySQL, PostgreSQL
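A small, self-contained way to practise several of the topics above at once (INNER JOIN, GROUP BY aggregation and a Common Table Expression) is the sqlite3 module in Python's standard library; the table and column names below are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Amina'), (2, 'Brian');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 40.0);
""")

# CTE + INNER JOIN + GROUP BY in a single query.
query = """
WITH order_totals AS (
    SELECT customer_id, SUM(amount) AS total_spent, COUNT(*) AS order_count
    FROM orders
    GROUP BY customer_id
)
SELECT c.name, t.total_spent, t.order_count
FROM customers AS c
INNER JOIN order_totals AS t ON t.customer_id = c.id
ORDER BY t.total_spent DESC;
"""
for row in conn.execute(query):
    print(row)   # ('Amina', 200.0, 2) then ('Brian', 40.0, 1)
```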
☀️ Learn Python Programming
✅ Learn Data Structures, Loops, If Statements
✅ How to use Regular Expressions for data cleaning and extraction tasks
✅ Understand Recursion Functions
✅ Understand Solid Libraries in Python like Pandas, Numpy, Scikit-learn, and Great Expectations
✅ Learn about unit and integration testing
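As a tiny illustration of the regex and testing points above, here is a sketch pairing a cleaning function with a unittest case; the phone-number cleaning rule is an arbitrary example.

```python
import re
import unittest

def extract_digits(raw: str) -> str:
    """Strip everything except digits from a raw string (example cleaning rule)."""
    return re.sub(r"\D", "", raw)

class ExtractDigitsTest(unittest.TestCase):
    def test_removes_punctuation_and_spaces(self):
        self.assertEqual(extract_digits("+254 (721) 000-000"), "254721000000")

    def test_empty_input(self):
        self.assertEqual(extract_digits(""), "")

if __name__ == "__main__":
    unittest.main()
```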
☀️ Learn about Data Modeling
✅ Read about data normalization and 3rd normal form
✅Read about fact, dimension, and aggregate tables
✅ Read about efficient table designs such as cumulative
✅ Learn about complex data types like MAP, ARRAY, and STRUCT
☀️ Learn about Cloud and Distributed Computing
✅ Learn about Shell Scripting in Linux. Most data engineers use shell scripting for cron jobs, setting up environments, or even working in the distributed environment.
✅ Gain knowledge of working with APIs: GET, PUT, and POST (Requests library in Python; see the sketch after this list)
✅ Learn the basics of Web Scraping with BeautifulSoup and Selenium libraries
✅ Master Git and Github for version control
✅ Learn Cloud Computing Fundamentals with AWS, Google Cloud and Microsoft Azure
✅ Read about the MapReduce architecture and see how it guided the design of today's distributed computing
✅ Understanding data integration techniques and best practices.
✅ Gain experience working with ETL tools such as Apache NiFi, Apache Kafka, and Talend
✅ Learn about partitioning, skew, and spilling to disk
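The Requests sketch referenced in the API point above might look like this; the base URL and endpoints are hypothetical, used only to show GET, POST and PUT.

```python
import requests

BASE_URL = "https://api.example.com"   # hypothetical endpoint, for illustration only

# GET: read a resource
response = requests.get(f"{BASE_URL}/users/1", timeout=10)
response.raise_for_status()            # raise if the server returned a 4xx/5xx status
print(response.json())

# POST: create a resource
created = requests.post(f"{BASE_URL}/users", json={"name": "Dan"}, timeout=10)
print(created.status_code)

# PUT: replace/update a resource
updated = requests.put(f"{BASE_URL}/users/1", json={"name": "Dancun"}, timeout=10)
print(updated.status_code)
```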
☀️ Learn about Data Quality, Data Governance and Security
✅ Understanding of data governance principles and best practices
✅ How do you write a good data check?
✅ How can you implement the write-audit-publish pattern in your pipelines?
✅ Gain familiarity with data quality and data profiling tools to ensure the accuracy of the data being integrated
✅ Knowledge of data security, privacy, and compliance regulations such as GDPR and CCPA
☀️ Learn about Job Orchestration
✅ Read about CRON
✅ Try out a Scheduler like Airflow or Prefect
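To make the scheduling point concrete, below is a minimal sketch of what a DAG might look like, assuming Apache Airflow 2.x is installed; the task bodies, DAG name and schedule are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")    # placeholder task body

def load():
    print("write data to the warehouse")          # placeholder task body

with DAG(
    dag_id="example_daily_pipeline",              # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",                   # cron expressions also work, e.g. "0 6 * * *"
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task                     # extract runs before load
```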
☀️ Apply Distributed Compute Principles
✅ A free trial with Snowflake or BigQuery is a good place to start
✅ Try out Apache Spark + S3 if you want to be adventurous
✅Gain knowledge of DevOps principles and CI/CD pipelines
===============================================================================================================================================
SOFTWARE ENGINEERING (QUANTUM AI, QUANTUM MECHANICS, AI && DIGITAL COMPUTING) : (ANALOGUE ELECTRONICS AND MECHANICS)
🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨 CONFIGURATIONS 🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨
👉 DOMAIN HOSTING
👉 WEB HOSTING
👉 EMAIL HOSTING
👉 SEARCH ENGINE OPTIMIZATION
=> AI
🌀 Patrick Loeber
🌀 Dr. Alex Ioannides
=> FOREX
🌀 Nick McCullum -> Trading Algorithm Basics -> https://www.youtube.com/watch?v=xfzGZB4HhEE
🌀 TradeOptionsWithMe Detailed -> https://www.youtube.com/watch?v=EeSrQdab0s0&list=PLtqRgJ_TIq8Y6YG8G-ETIFW_36mvxMLad
=> SOFTWARE
🌀 Fireship
🌀 Isaac Harris-Holt
🌀 CodeDoner
🌀 freeCodeCamp
👉 DEVELOPMENT SETUP
👉 INSTALL GOOGLE-CHROME
Install in Softwares
OR
wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
sudo apt install ./google-chrome-stable_current_amd64.deb
sudo apt install google-chrome-stable
👉 GITLAB
👉 JETBRAINS
1 Toolbox
2 Space (Internet)
3 Datalore (Internet)
4 Dataspell
5 Projector (Internet)
6 Pycharm -> Rust and Micropython plugin
7 Rustrover
8 Datalore
9 Data Grip
👉 CLOUD-BASED DEVELOPMENT ENVIRONMENT
Project IDX
Github Codespaces
👉 Alternative for JETBRAINS
Anaconda
VScode
👉 Vscode A MUST Installation
✅ [CODE RUNNER] => Download in windows
code runner
✅ [VSCODE PLUGINS]
VSCode Icons Team - vscode-icons
✅ [VIM]
vscodevim - Vim
✅ [GITHUB]
Github - Github Copilot
Github - Github Copilot Chat
Github - Github Theme
Gitlens
✅ [LIVE SHARE]
Microsoft - All Live Share plugins
✅ [HTML]
✅ [XML]
RedHat - xml
✅ [CSS]
Syler - sass
✅ [PYTHON]
Microsoft - Python
Microsoft - Pylance
Microsoft - Flake8 -> [Standard]
Microsoft - Pylint
Microsoft - Black Formatter
Microsoft - Mypy ->[Standard]
Microsoft - autopep8 -> [Standard]
Microsoft - isort
Microsoft - Tensorboard
Kevin Rose - Python Indent
✅ [INTELLICODE]
Microsoft Intellicode Plugins
✅ [RUST]
rust-analyzer
CodeLLDB
Even Better TOML
crates
✅ [C]
Microsoft - C/C++ IntelliSense
Microsoft - C/C++ Extension Pack
Microsoft - C/C++ Themes
Microsoft - CMake Tools
LLVM - clangd
✅ [POSTMAN]
Postman
✅ [POSTGRESQL]
Microsoft - postgreSQL
✅ [COUCHBASE]
Couchbase
Couchbase - Couchbase Lite
✅ [SURREALQL]
SurrealDB - SurrealQL
✅ [PRETTIER SQL]
inferrinizzard - Prettier SQL VSCode
✅ [JUPYTER]
Microsoft - All Jupyter plugins
✅ [MARKDOWN PREVIEW MERMAID]
Matt Bierner - Markdown Preview Mermaid
✅ [YAML]
RedHat - yaml
✅ [ROS]
Microsoft ROS
✅ [ARDUINO]
Microsoft - Arduino
✅ [SERIAL MONITOR]
Microsoft - Serial Monitor
👉 WINDOWS PLATFORM
Anaconda
VScode
Pycharm -> Rust and Micropython plugin
Rustrover
👉 C LANGUAGE
Conan package manager -> MULTIPLATFORM and FLEXIBLE
vcpkg (Windows and Linux OS package manager) -> RIGID and mostly for private use cases (i.e.) Microsoft
👉 EXECUTING PYTHON (INTERPRETER) CODE
--------------------------------------------------------------------------------------------------------------------------------------
1 BASH EXECUTION OF PYTHON IN LINUX (Here do not use pyinstaller) -> https://www.youtube.com/watch?v=vEQ8CXFWLZU
2 DESKTOP EXECUTION OF PYTHON IN LINUX (Here do not use pyinstaller) -> https://www.youtube.com/watch?v=9CTmC5Y7QeM
3 PYINSTALLER -> https://www.youtube.com/watch?v=p3tSLatmGvU
LINUX
Debreate for debian distribution [.deb]
Other alternatives -> https://alternativeto.net/software/inno-setup/?platform=linux
WINDOWS [.exe]
Traditional -> Pyinstaller(Setup must be within the parent directory during execution)
Modern -> Inno Setup(Setup is compiled with all the dependencies required, hence must NOT be within the parent directory during execution)
👉 PYINSTALLER => https://www.youtube.com/watch?v=p3tSLatmGvU
pyinstaller --onefile --name=passwordlocker --windowed "/home/dan/PycharmProjects/dancunmanyinsa/manage.py"   (--onedir and --onefile are mutually exclusive; pick one)
OR
In the project directory, when the development virtual environment is deactivated, type this on the terminal
pyinstaller main.py --onefile --noconsole
-------------------------------------------------------------------------------------------------------------------------------------
👉 REMOVE JAVA, DOCKER, KUBERNETES COMPLETELY
For java:
sudo apt remove default-jre
sudo apt-get remove --purge icedtea-* openjdk-*
Then check whether Java is still present by running the following (if it returns nothing, the Java dependencies have been removed):
sudo dpkg --list | grep -i jdk
👉 GOOGLE-CHROME INSTALLATION ON DEBIAN DISTRIBUTION
wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
sudo apt install ./google-chrome-stable_current_amd64.deb
sudo apt install google-chrome-stable
👉 INSTALL VSCODE
FLATHUB [BEST OPTION]
OR
FROM OFFICIAL MICROSOFT REPOSITORY
Step 1: Update the system repository
Run the following command to update the system's repository and ensure you get the latest vscode version:
$ sudo apt update
Step 2: Install Package Dependencies
For proper operation, vscode requires you to install package dependencies. Run the following command to resolve package dependencies:
$ sudo apt install software-properties-common apt-transport-https wget -y
Step 3: Add GPG Key
Import the GPG key provided by Microsoft to verify the package integrity. Enter:
$ wget -q https://packages.microsoft.com/keys/microsoft.asc -O- | sudo apt-key add -
Step 4: Add Repository
Run the following command to add the Visual Studio Code repository to your system:
$ sudo add-apt-repository "deb [arch=amd64] https://packages.microsoft.com/repos/vscode stable main"
Step 5: Install vscode
After enabling the repository, install vscode by running:
$ sudo apt install code
Step 6: Verify Installation
Verify vscode installation by running:
$ code --version
👉 PYTHON TOPICS
PYTHON (INTERPRETER) DATA STRUCTURES
✅ IMMUTABLE
-Tuple -> Less memory space than a list, has fewer functionalities.
-Frozenset -> Unordered, different (hashable) data types; the immutable form of a set.
✅ MUTABLE
-List -> Orderly arrangement, conventionally same-type elements. More memory space than a tuple, has more functionalities.
-Set -> Unordered arrangement, unique elements of different (hashable) data types (a plain set is mutable; frozenset is the immutable variant).
- Dictionary
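A quick way to see the memory difference for yourself is sys.getsizeof; the exact byte counts are CPython- and version-specific, so treat them as indicative only.

```python
import sys

items_list = [1, 2, 3, 4, 5]
items_tuple = (1, 2, 3, 4, 5)

print(sys.getsizeof(items_list))    # larger: a list carries extra bookkeeping so it can grow
print(sys.getsizeof(items_tuple))   # smaller: a tuple is fixed-size and immutable

items_list.append(6)        # fine: lists are mutable
# items_tuple.append(6)     # AttributeError: tuples cannot be modified in place
```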
✅ ADVANCED PYTHON (INTERPRETER) CONCEPTS
✅ OOP Fundamentals
File Handling
setup.py -> Use toml file
Dictionary
Generator
Metaclass (Knowledge assists you in developing frameworks, e.g. the Django ORM and Flask SQLAlchemy, and patterns such as factory and singleton) -> https://www.youtube.com/watch?v=uLPnBaUhjKU
Macro
✅ PYTHON POLYMORPHISM
1. Abstraction
2. Inheritance
3. Generics
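A compact sketch tying the three together (abstraction via abc, inheritance, and generics via typing, Python 3.9+); the shape classes are arbitrary examples.

```python
from abc import ABC, abstractmethod
from typing import Generic, TypeVar

class Shape(ABC):                        # 1. Abstraction: abstract base class
    @abstractmethod
    def area(self) -> float: ...

class Square(Shape):                     # 2. Inheritance: concrete subclasses
    def __init__(self, side: float):
        self.side = side
    def area(self) -> float:
        return self.side ** 2

class Circle(Shape):
    def __init__(self, radius: float):
        self.radius = radius
    def area(self) -> float:
        return 3.14159 * self.radius ** 2

T = TypeVar("T", bound=Shape)

class Drawing(Generic[T]):               # 3. Generics: container parameterised by shape type
    def __init__(self, shapes: list[T]):
        self.shapes = shapes
    def total_area(self) -> float:
        return sum(shape.area() for shape in self.shapes)

print(Drawing([Square(2), Circle(1)]).total_area())   # polymorphic calls to area()
```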
✅ FASTAPI CRASH COURSE ->https://www.youtube.com/watch?v=62pP9pfzNRs
👉 RUST TOPICS
✅ RUST CHEATSHEET
Rust tuples
Rust HashMap is equivalent to Python Dictionary
✅ Ownership and Borrowing:
1. The function that initializes the mutable variable is the owner of the variable.
2. The owner can manipulate the variable any number of times.
3. Once the owning function exits, the value is dropped and is no longer accessible to any borrower.
👉 C TOPICS
C1.png
👉 SOFTWARE PERFORMANCE BOOST
1. Reduce pixels in images.
2. Have local scripts in the project folder; don't load them from external APIs or links.
3. Compress Front-end scripts. e.g htmx, css, pyscript
👉 SEARCH ENGINE OPTIMIZATION
1. <meta charset="UTF-8">
2. <meta name="Description" content="Details regarding the website or web application">
3. HTML Validation
4. Content is King - Have great content
5. Key Words and Back Links
🎯 SEARCH ENGINE OPTIMIZATION
<!-- This site is optimized with the Yoast SEO plugin v22.7 - https://yoast.com/wordpress/plugins/seo/ -->
<title>Sterling Crown Tutors</title>
<meta name="description" content="We offer high-quality academic tutoring for our students." />
<link rel="canonical" href="https://sterlingtuitioncentre.github.io/" />
<meta property="og:locale" content="en_US" />
<meta property="og:type" content="website" />
<meta property="og:title" content="Sterling Crown Tutors />
<meta property="og:description" content="We offer high-quality academic tutoring for our students." />
<meta property="og:url" content="https://sterlingtuitioncentre.github.io/" />
<meta property="article:modified_time" content="2024-8-05T12:44:22+00:00" />
<meta property="og:image" content="img/logo.png" />
<meta name="twitter:card" content="summary_large_image" />
🎯 => GENERATE THE KEY ON GOOGLE <meta name="google-site-verification" content="XtmfApHLWkQ32uN3BV8mb3LgLNFMF5h3UHFd0_JfIf8" />
<!-- / Yoast SEO plugin. -->
👉 THIRD-PARTY BOTS
LIVE CHAT BOT + NEWS LETTER => https://www.tidio.com/
🎯 LIVE CHAT BOT + NEWS LETTER => https://www.chatwoot.com/
🎯 NEWS LETTER => https://app.beehiiv.com/subscribers
🎯 WHATSAPP LIVE CHAT => https://elfsight.com/
WHATSAPP BOT =>https://portal.infobip.com/
👉 SOFTWARE DESIGN BEST PRACTICE
### SYSTEM ANALYSIS AND DESIGN
SYSTEM UML DESIGN
-> Use-case Diagram
-> Sequence Diagram
----------
* BDD - Behaviour Driven Development
* System Flow-chart Diagram
----------
### DATABASE ANALYSIS AND DESIGN
DATABASE FLOW-CHART DIAGRAM
👉 SOFTWARE ENGINEERING BEST PRACTICE
🎯 SECURITY: gitignore files containing security keys, and use OTP messages to manage payments when consuming APIs from a payment application system.
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
🎯 DOTFILES ON MY GITHUB
🌀.gitignore(*.egg-info, *.pyc, .coverage, .tox, /venv*, coverage-html, dist)
🌀
# Ignore temporary files
*.tmp
*.temp
# Ignore log files
*.log
# Ignore build artifacts
/build/
# Ignore IDE specific files
.idea/
.vscode/
🌀 https://github.com/devopsengineerDan/dot-gitignore/blob/master/.gitignore
🌀 https://github.com/devopsengineerDan/gitignore/blob/main/
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
### 1. Adhere to Programming Language Standard library and Style of writing code.
### 2. Adhere to Software Design Principles and Patterns.
### 3. Naming of variables, functions, classes and modules should be explicit.
### 4. In python have __init__.py in folder containing app.
### 5. Have the main function in a separate file.
### 6. Software Testing. Software Configuration. Software Deployment.
### 7. Schedule more time, to avoid conflicts at deadline time.
### 8. High speed is best for prototyping. Low speed is best for large projects, in order to identify use-cases of design principles and patterns on the projects for performance, reliability, scalability, adaptability, simplicity, correctness, security and upgradability.
👉 PYTHON (OFFICIAL) -> https://docs.python.org/
-> https://www.pythontutorial.net/python-oop/
👉 PYTHON EXTENSIONS/PACKAGES IN RUST
-> https://www.youtube.com/watch?v=jlWhnrk8go0
-> https://www.youtube.com/watch?v=DpUlfWP_gtg
BEST PRACTICE + (autopep8 idiomatic python using flake8, pytest and hypothesis, mypy => TESTING)
### 1. Always initialize in the project folder.
### 2. Always have a runtime.txt with version (e.g) python 3.8.6 for backward compatibility with python interpreter.
### 3. Always have a requirements.txt with version (e.g) django 4.0.10 for backward compatibility with python dependencies. OR flask>=1.1.2,<2.0
where 4: MAJOR VERSION - Incremented when major changes are made like architectural changes.
0: MINOR VERSION - Incremented when minor changes are made which do NOT break the API, like adding a new feature
10: PATCH VERSION - Incremented with bug fixes.
### 4. Have requirements-dev.txt for testing and deployment(i.e heroku) packages.
### 👉 5. -Use __init__.py on package or library to be imported.
* In __init__.py file have this:
__all__ = ["module1", "module2"]
-Python dev tools (i.e) Python dev tools image
-Python dev imports (i.e) Python dev imports image
-Use a shebang line so the main file can be executed from the shell. (i.e) #!/usr/bin/env python
-Use utf-8 for character encoding in main file. (i.e) # coding: utf-8
-Avoid using header links (i.e) HTML,CSS,Pyscript. Have files located locally on project folder
-Have pyproject.toml file to manage setup configurations, for flexibility in execution of program and for points(2,3 and 4).
(i.e) pip install -e .
### 6. Have .gitignore and have necessary files in there.
### 7. Use pipenv to manage virtual environments with python versions.
### 8. Write explicit code for readability and scalability.
### 9. Use Data Structures correctly for high performance and memory utilisation.
### 10. For engineering best practice use standard documentation,libraries and well established and supported frameworks.
SPEED UP PYTHON EXECUTION ON INTERPRETER
### 1. Reduce how often you mutate variables, except during prototyping before the scope is clearly defined. Examples of mutable types: list, set, dictionary.
### 2. Use of type notations (i.e) mypy.
### 3. Use of pydantic model in API.
### 4. Use of struct.
SPEED UP PYTHON EXECUTION ON COMPILER
### 1. The above four points.
### 2. Use of let (immutable by default) and let mut in ownership and borrowing.
### 3. Use of fn when performing function overloading, function overriding and using macros.
👉 RUST (OFFICIAL) -> https://www.rust-lang.org/learn
-> https://education.web3.foundation/docs/introrust
BEST PRACTICE + (clippy -> TESTING)
### 1. Rust simplicity and reduction of complexity.
### 2. Rust unsafe{} keyword for performance tuning in hardware. PERFORMANCE_USECASE(Code must be secure & Adhere to Standard Guidelines of unsafe principles) COMPILER CHECKS-> Must be checked by Software Engineer during Software Review and Testing.
### 3. Rust [no_std] for easier debugging hence less development time. EXPERIMENTATION_USECASE(Code must be secure & Adhere to Standard Guidelines of [std]) STD LIBRARY-> Must be checked by Software Engineer during Software Review and Testing.
### 4. Rust unwrap() function for easier error handling. Rust does not use exception handling. TESTING_USECASE(Code must be secure & Adhere to Standard Guidelines of Rust error handling) RUN-TIME/COMPILE-TIME ANNOTATIONS-> Must be checked by Software Engineer during Software Review and Testing.
### 5. Rust support for C toolchains, kernels and hardware.
### 6. Rust editions for backward compatibility.
### 7. Rust Compiler and Cargo.
👉 C (OFFICIAL) -> https://en.cppreference.com/w/c/23
BEST PRACTICE Check the link ->>> https://www.youtube.com/watch?v=GWYhtksrmhE
+(gcc -c -x c -std=c89 -pedantic -pedantic-errors -Wall -Wextra -Wno-comment -Wno-parentheses -Wno-format-zero-length test.c)
### 1A. Simple control flow (e.g) a function should be limited to only one task to ensure readability and should not exceed the size of a page.
### 1B. Also do not use goto, setjmp or longjmp statements, or recursion (a function that calls itself), which lead to cyclic control flow that is difficult to understand and follow, and to runaway code which can crash.
### 2. Limit loops: give every loop a fixed upper bound on an integer counter, not a pointer.
### 3. Do not use the heap, to ensure memory use is deterministic. This prevents use-after-free bugs and memory leaks.
### 4. Practise data hiding, that is, declaring variables at the smallest scope in which they are used. It restricts access to the data.
### 5. Use indentation to explain your code.
### 6. Check return value. If a return value is going to be ignored, you should explicitly cast the return value to a void type. (e.g) (void)printf(""); or (void)scanf("");
### 7. Dereference operations should not be hidden in macro definitions or inside typedef declarations.
### 8. Must dereference a pointer one layer at a time. This forces you to create structures that properly track your pointers to ensure they are not used incorrectly.
### 9. Limit the C preprocessor. (This is more advanced and requires an advanced user)
### 10. No use of function pointers. They obfuscate the control flow graph of your program, making it difficult to statically analyse your code and also to fully test your code.
### 11. Compile your code with all warnings enabled and compile in pedantic mode (e.g) $gcc <main_cfile>.c -Wall -Werror -Wpedantic OR $gcc <main_moderncfile>.c -std=c2x -Wall -Wextra -pedantic (then run ./a.out). This ensures all warnings in your code will be raised by the compiler.
### 12. Analyse code with multiple static code analysers with different rule sets.
### 13. Test your code with unit tests.
👉 Bash Shell BEST PRACTICE
### 1. Keep Bash Shell and related software up to date with security patches and updates.
### 2. Follow best practices for secure coding and scripting, including input validation, error handling, and secure data storage and transmission.
### 3. Implement user and group permissions and access controls to restrict access to sensitive data and system resources.
### 4. Use encrypted communication protocols, such as SSH, to protect against unauthorized access and data theft.
### 5. Configure firewall and network security settings to prevent unauthorized access and protect against external threats.
### 6. Monitor system logs and security events for signs of potential security threats or vulnerabilities.
### 7. Use security tools and software, such as intrusion detection and prevention systems, to detect and respond to potential security threats.
👉 APIS BEST PRACTICE
🎯 Secure your API with these 16 Practices
### 1. Authentication 🕵️♀️ - Verifies the identity of users accessing APIs.
### 2. Authorization 🚦 - Determines permissions of authenticated users.
### 3. Data Redaction 🖍️ - Obscures sensitive data for protection.
### 4. Encryption 🔒 - Encodes data so only authorized parties can decode it.
### 5. Error Handling ❌ - Manages responses when things go wrong, avoiding revealing sensitive info.
### 6. Input Validation & Data Sanitization 🧹 - Checks input data and removes harmful parts.
### 7. Intrusion Detection Systems 👀 - Monitor networks for suspicious activities.
### 8. IP Whitelisting 📝 - Permits API access only from trusted IP addresses.
### 9. Logging and Monitoring 🖥️ - Keeps detailed logs and regularly monitors APIs.
### 10. Rate Limiting ⏱️ - Limits user requests to prevent overload.
### 11. Secure Dependencies 📦 - Ensures third-party code is free from vulnerabilities.
### 12. Security Headers 📋 - Enhances site security against types of attacks like XSS.
### 13. Token Expiry ⏳ - Regularly expiring and renewing tokens prevents unauthorized access.
### 14. Use of Security Standards and Frameworks 📘 - Guides your API security strategy.
### 15. Web Application Firewall 🔥 - Protects your site from HTTP-specific attacks.
### 16. API Versioning 🔄 - Maintains different versions of your API for seamless updates.
4xx
Error Code – Meaning
400 Bad Request – Your request sucks
401 Unauthorized – Your API key is wrong
403 Forbidden – The kitten requested is hidden for administrators only
404 Not Found – The specified kitten could not be found
405 Method Not Allowed – You tried to access a kitten with an invalid method
406 Not Acceptable – You requested a format that isn’t json
410 Gone – The kitten requested has been removed from our servers
418 I’m a teapot
429 Too Many Requests – You’re requesting too many kittens! Slow down!
5xx
Error Code – Meaning
500 Internal Server Error – We had a problem with our server. Try again later.
503 Service Unavailable – We’re temporarily offline for maintenance. Please try again later.
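As a small illustration of a few of the practices above (authentication, input validation and error handling with the status codes just listed), here is a hedged Flask sketch; the API key, route and messages are placeholders, not a production setup.

```python
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
API_KEY = "change-me"    # placeholder; load from a secret store in real code

@app.before_request
def authenticate():
    # Practice 1 (Authentication): reject requests without the expected key -> 401
    if request.headers.get("X-API-Key") != API_KEY:
        abort(401, description="Your API key is wrong")

@app.route("/kittens", methods=["POST"])
def create_kitten():
    payload = request.get_json(silent=True)
    # Practice 6 (Input validation): reject malformed bodies -> 400
    if not payload or not isinstance(payload.get("name"), str):
        abort(400, description="Bad request: 'name' (string) is required")
    return jsonify({"name": payload["name"]}), 201

@app.errorhandler(404)
def not_found(error):
    # Practice 5 (Error handling): consistent JSON errors without leaking internals
    return jsonify({"error": "The specified kitten could not be found"}), 404

if __name__ == "__main__":
    app.run()
```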
👉 WEB
WEB1 - Read
WEB2 - Read and Write
WEB3 - Read, Write and Ownership => AI
=> Blockchain
=> (Distributed System and Database)
=> LLVM and Web Assembly
=> IOT
🧨🧨🧨 PROGRESSIVE WEB APPS
🧨🧨🧨 BUNDLER
👉 AI/ML/DL BEST PRACTICE
### 1. Data cleaning, ingestion, storage and transformation.
### 2. Feature engineering.
### 3. Training models.
### 4. Engineering end-to-end ML systems.
### 5. Automating the production of results, decisions and actions of any kind.
👉 Deployment templates for common use-cases:
batch scoring data -> https://github.com/devopsengineerDan/python-bodywork-batch-job-project
deploying a prediction service with a REST API -> https://github.com/devopsengineerDan/python-bodywork-serve-model-project
scheduling a continuous-training pipeline -> https://github.com/devopsengineerDan/python-bodywork-ml-pipeline-project
👉 TOOLS
Data Science -> PyTorch, Tensorflow
ML Engineering -> DVC, DVC Pipelines
Data Engineering -> Dagster/Bodywork for docker and kubernetes use-case, Ibis, SQL Alchemy, Django SQL, Surrealdb
Software Engineering -> Pydantic
👉 PYTHON INTERPRETER APPLICATION USE-CASE
------------
👉 SIMULATION
👉 RESEARCH
👉 BACK-TESTING
------------
Cybersecurity
Computer Networks
Data Science, AI
[Web, Stand-alone, Console] Applications
I/O Processes
👉 PYTHON FRAMEWORKS
Django => Large-scale projects (Consistency and standard libraries) - ASGI
FastAPI => Small to large-scale projects (Consistency and standard libraries) - ASGI
Flask => Small to large-scale projects (Flexibility in libraries) - WSGI and ASGI
✅ WSGI handles requests synchronously (suited to a few worker processes); ASGI adds asynchronous concurrency.
👉 SQL => Consistency, Control and Performance
1. SPECIFIC ✅ - Programming language runtime version
- TOML configuration
✅ - PYTHON 2.7 (GUI, GAMES, MOBILE) => Most libraries were written in Python 2.7
2. DATA TYPES
3. READ DOCUMENTATION
4. LOGGING ERRORS FROM THE STANDARD PROGRAMMING LANGUAGE DEBUGGER (INTERPRETER/COMPILER)
5. LOGGING ERRORS FROM THIRD-PARTY LINTERS
6. REPRODUCE THE WHOLE CODE
7. Take a break.
✅ GENERAL
1. SECURE => FIXED
2. SIMPLE (FIXED TARGET & FIXED NUMBER OF RECURSION) => Accurate, Correct, Effective
3. ADAPTIVE => Efficient Data Interpretation/Transformation & Space Efficient
4. COMPLETE => High Performance & Backward Compatibility
5. PERFECT => FIXED
✅ SPECIFIC
***
RUNTIME => Number of primitive operations needed to solve a task/job.
PRIMITIVE OPERATIONS => Declarative and Boolean Conditional Statements.
***
STATIC (TIME COMPLEXITY)
1. Fixed size.
2. [Fixed and Reduced] Number of Iteration.
3. Completion.
------------
APPLICATION
- Python Speed Up Interpreter Performance.
- Use Data Structure Appropriately in terms of Performance.
- Use built-in functions (e.g) iter in python, match in Rust and C to achieve zero-cost abstraction.
***
DYNAMIC (SPACE COMPLEXITY)
1. [Reduced] Number of Iteration.
2. Stop OR Delete an Iteration.
3. Continuous.
-------------------
APPLICATION
-> [Use Data Structure Appropriately in terms of Space utilisation.]
- Tuples occupy less memory space than list in Python.
- List in Python and Vector in Rust have more functionality.
- Define object data (e.g) 32 bit or 64 bit.
***
ASYMPTOTIC GROWTH => Linear Algorithm is Best in the very long-term/dynamic.
BIG-O NOTATION => Check summary pdfs and BOOK
***
👉 DESIGN PATTERNS
DESIGN PATTERNS -> https://www.youtube.com/watch?v=FLmBqI3IKMA
1 Observer
2 Strategy
3 Facade
4 Singleton
5 Adapter/Bridge
............................................................................................................................
1 Principal software engineer
2 Software Architect
3 Lead software engineer
4 Senior software engineer
5 Junior software engineer
-----------------------------------------------------------------------------------------------------------------------------
CREATIONAL -> singleton, prototype, builder, factory
STRUCTURAL -> facade, proxy
BEHAVIOURAL -> observer, iterator, mediator, state
Here’s a quick cheat sheet for lots of design patterns. The examples in this post are linked to Java examples; however, they are not limited to a single-language implementation. Special thanks to Derek Banas.
-> Observer Design Pattern: Contains an object usually referred to as the subject that maintains a list of dependent classes called observers. The subject object notifies all observers automatically of any state changes by calling their methods.
-> Factory Design Pattern: Creates objects that share the same super class type. Usually has a method named like makeTypeOfSuperClass which accepts an argument to tell it what to do.
-> Abstract Factory Design Pattern: Groups together individual factories. The objects that these factories return share a common API.
-> Singleton Design Pattern: Used to eliminate the option of instantiating more than one object. Contains a static variable that holds the first instance of the object created. This object is always returned, typically by using a getInstance method. The constructor is set as a private function to prevent users instantiating it.
-> Builder Design Pattern: Builder objects contain methods that tell a factory how to build/configure an object.
-> Prototype Design Pattern: Creates new objects by copying other objects. Nice to use a clone factory that accepts an object to clone.
-> Decorator Design Pattern: Adds functionality by using many simple classes.
-> Command Design Pattern: Allows you to set aside a list of commands for later use. Each command is typically it’s own command object that implements the same command interface.
-> Adapter Design Pattern: Allows 2 incompatible objects that share a similar purpose to work transparently by abstraction.
-> Facade Design Pattern: Decouples or separates the client from all of the sub components. Main purpose is to simplify interfaces so you don’t have to worry about what’s going on under the hood. Similar to a Service Design Pattern, except a Service typically communicates with an external service.
-> Bridge Design Pattern: Defines 2 layers of abstraction. One for the target, which can be extended for different types of receivers, and one for the controls, which can be extended to different types of controls that will be able to communicate with the targets.
-> Template Design Pattern: Usually contains a final method that defines the steps of an algorithm. It allows subclasses to configure the template by overwriting methods.
-> Iterator Design Pattern: Provides a uniform way to access different collection types of objects. For instance, creating an iterator interface that your collections of Arrays, Lists, or Maps can implement so you can iterate over them the same way.
-> Composite Design Pattern: Allows you to attach individual objects and a composition of objects uniformly. For example, imagine a folder tree structure starting at root. This can be the root composite object (aka folder) where it accepts types of files and types of folders. File types have no child components attached to them but a folder can have many more files and even more groupings contained within.
-> Flyweight Design Pattern: Used when creating a lot (i.e.: 100,000 or more) of similar objects. Objects are created from a factory that checks whether a pre-existing object with a similar definition already exists. If a pre-existing object with a similar configuration exists, then the factory will return it rather than creating a new object and reconfiguring.
-> State Design Pattern: Allows an object to change its behavior when its state changes. Each state implements a behavior associated with its state of context. Should be implemented with as few states as possible. E.g.: imagine a database connection that has two states (connected and disconnected) and a query method. The connection object executes the query method differently depending on its connection state.
-> Proxy Design Pattern: Provides a class with limited access to another class. This is done for security reasons.
-> Chain of Responsibility Design Pattern: Sends a problem to an object and, if that object can’t handle it, passes it to an object chained to it that might. The object chained to it can have an additional object chained, and the problem will continue to run its course until it finds an object that can solve it.
-> Interpreter Design Pattern: This pattern solves a problem by using a context object to interpret an issue and find a class using reflection to return the answer.
-> Mediator Design Pattern: Handles communication between related objects without having to know anything about each other. Objects are typically constructed with the mediator passed in. Mediator will regulate interactions.
-> Memento Design Pattern: Useful for saving history states of an object. The object stored in state is referred to as the Memento. Mementos are managed by caretakers, which contain a list of each version for retrieval. Lastly, we have an originator which is responsible for communicating with the caretaker to save and restore states. Imagine a save, undo, and redo scenario.
-> Visitor Design Pattern: Allows you to perform the same action on many objects of different types without much alteration to those classes. A concrete Visitor class will implement a common visit method for each type of visitor. Each visitable will implement a Visitable interface that implements an accept method. Upon usage, each visitable object can pass a type of visitor object to perform different functions. Extremely useful for creating new ways of Visitors without touching Visitable classes.
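To ground the first entry of the cheat sheet, here is a minimal Observer sketch in Python; the class names are illustrative and not tied to any framework.

```python
class Subject:
    """Maintains a list of observers and notifies them of state changes."""
    def __init__(self):
        self._observers = []
        self._state = None

    def attach(self, observer):
        self._observers.append(observer)

    def set_state(self, state):
        self._state = state
        for observer in self._observers:
            observer.update(state)       # push the new state to every observer

class LoggingObserver:
    def update(self, state):
        print(f"state changed to {state!r}")

subject = Subject()
subject.attach(LoggingObserver())
subject.set_state("connected")            # prints: state changed to 'connected'
```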
👉 SOLID DESIGN PRINCIPLES -->> Check my notes
What is SOLID Design
Michael Feathers can be credited for creating the mnemonic SOLID which is based on principles from Robert C. Martin’s paper, “Design Principles and Design Patterns”.
The principles are
1. Single Responsibility Principle
2. Open Closed Princple
3. Liskov's Substitutability Principle
4. Interface Segregation Principle
5. Dependency Inversion Principle
We will cover these in more detail shortly. The most important thing to note about the SOLID design principles is they are meant to be used holistically. Choosing one and just one is not going to do much for you. It's when used together you start to see the real value in these principles.
What is the Pythonic Way
Although there is no official definition for the Pythonic way, a little Googling gives you several answers along this general vein.
"Above correct syntax Pythonic code follows the normally accepted conventions of the Python community, and uses the language in a way that follows the founding philosophy." - Derek D.
I think the most accurate description comes from the Zen of Python.
The Zen of Python, by Tim Peters
Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than right now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!
One Last Thing
Before I jump right into the principles and how they relate to the Zen of Python, there's one thing I want to do that no other SOLID tutorial does. Instead of using a different code snippet for each principle, we are going to work with a single code base and make it more SOLID as we cover each principle. Here is the code we are going to start with.
class FTPClient:
    def __init__(self, **kwargs):
        self._ftp_client = FTPDriver(kwargs['host'], kwargs['port'])
        self._sftp_client = SFTPDriver(kwargs['sftp_host'], kwargs['user'], kwargs['pw'])

    def upload(self, file:bytes, **kwargs):
        is_sftp = kwargs['sftp']
        if is_sftp:
            with self._sftp_client.Connection() as sftp:
                sftp.put(file)
        else:
            self._ftp_client.upload(file)

    def download(self, target:str, **kwargs) -> bytes:
        is_sftp = kwargs['sftp']
        if is_sftp:
            with self._sftp_client.Connection() as sftp:
                return sftp.get(target)
        else:
            return self._ftp_client.download(target)
👉 1. Single Responsibility Principle (SRP)
Definition: Every module/class should only have one responsibility and therefore only one reason to change.
Relevant Zen: There should be one-- and preferably only one --obvious way to do things
The Single Responsibility Principle (SRP) is all about increasing cohesion and decreasing coupling by organizing code around responsibilities. It's not a big leap to see why that happens. If all the code for any given responsibility is in a single place that's cohesive and while responsibilities may be similar they don't often overlap. Consider this non-code example. If it is your responsibility to sweep and my responsibility to mop there's no reason for me to keep track of whether or not the floor has been swept. I can just ask you, "has the floor been swept"? and base my action according to your response.
I find it useful to think of responsibilities as use cases, which is how our Zen comes into play. Each use case should only be handled in one place, in turn, creating one obvious way to do things. This also satisfies the, "one reason to change" portion of the SRP's definition. The only reason this class should change is if the use case has changed.
Examining our original code we can see the class does not have a single responsibility because it has to manage connection details for an FTP, and SFTP server. Furthermore, the methods don't even have a single responsibility because both have to choose which protocol they will be using. This can be fixed by splitting the FTPClient class into 2 classes each with one of the responsibilities.
class FTPClient:
    def __init__(self, host, port):
        self._client = FTPDriver(host, port)

    def upload(self, file:bytes):
        self._client.upload(file)

    def download(self, target:str) -> bytes:
        return self._client.download(target)


class SFTPClient(FTPClient):
    def __init__(self, host, user, password):
        self._client = SFTPDriver(host, username=user, password=password)

    def upload(self, file:bytes):
        with self._client.Connection() as sftp:
            sftp.put(file)

    def download(self, target:str) -> bytes:
        with self._client.Connection() as sftp:
            return sftp.get(target)
One quick change and our code is already feeling much more Pythonic. The code is sparse, and not dense, simple not complex, flat and not nested. If you're not on board yet think about how the original code would look with error handling compared to the code following SRP.
👉 2. Open Closed Principle (OCP)
Definition: Software Entities (classes, functions, modules) should be open for extension but closed to change.
Relevant Zen: Simple is better than complex. Complex is better than complicated.
Since the definition of change and extension are so similar it is easy to get overwhelmed by the Open Closed Principle. I've found the most intuitive way to decide if I'm making a change or extension is to think about function signatures. A change is anything that forces calling code to be updated. This could be changing the function name, swapping the order of parameters, or adding a non-default parameter. Any code that calls the function would be forced to change in accordance with the new signature. An extension, on the other hand, allows for new functionality, without having to change the calling code. This could be renaming a parameter, adding a new parameter with a default value, or adding the *arg, or **kwargs parameters. Any code that calls the function would still work as originally written. The same rules apply to classes as well.
Here is an example of adding support for bulk operations.
Your gut reaction is probably to add upload_bulk and download_bulk functions to the FTPClient class. Fortunately, that is also a SOLID way to handle this use case.
class FTPClient:
    def __init__(self, host, port):
        ...  # For this example the __init__ implementation is not significant

    def upload(self, file:bytes):
        ...  # For this example the upload implementation is not significant

    def download(self, target:str) -> bytes:
        ...  # For this example the download implementation is not significant

    def upload_bulk(self, files:List[str]):
        for file in files:
            self.upload(file)

    def download_bulk(self, targets:List[str]) -> List[bytes]:
        files = []
        for target in targets:
            files.append(self.download(target))
        return files
In this case, it's better to extend the class with new functions than to extend through inheritance, because a BulkFTPClient child class would have to change the function signature of download to reflect that it returns a list of bytes rather than just bytes, violating the Open Closed Principle as well as the Liskov Substitution Principle.
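To make the rejected option concrete, here is a sketch (my own illustration, reusing FTPClient and the List import from the snippet above) of what that child class would look like, with download no longer matching the parent's contract:

class BulkFTPClient(FTPClient):
    def download(self, targets: List[str]) -> List[bytes]:  # parent returns bytes, child returns List[bytes]
        files = []
        for target in targets:
            files.append(super().download(target))
        return files

Any caller written against FTPClient.download would now get back a list it never asked for.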
👉 3. Liskov Substitution Principle (LSP)
Definition: If S is a subtype of T, then objects of type T may be replaced with objects of Type S.
Relevant Zen: Special cases aren’t special enough to break the rules.
The Liskov Substitution Principle was the first of the SOLID design principles I learned and the only one I learned at university. Maybe that's why this one is so intuitive to me. A plain-English way of saying it is: "Any child class can replace its parent class without breaking functionality."
You may have noticed that all of the FTP client classes so far have the same function signatures. That was done purposefully so they would follow the Liskov Substitution Principle. An SFTPClient object can replace an FTPClient object, and whatever code is calling upload or download is blissfully unaware.
Another specialized case of FTP file transfers is supporting FTPS (yes FTPS and SFTP are different). Solving this can be tricky because we have choices. They are:
1. Add upload_secure, and download_secure functions.
2. Add a secure flag through **kwargs.
3. Create a new class, FTPSClient, that extends FTPClient.
For reasons we will get into with the Interface Segregation and Dependency Inversion principles, the new FTPSClient class is the way to go.
class FTPClient:
    def __init__(self, host, port):
        ...

    def upload(self, file: bytes):
        ...

    def download(self, target: str) -> bytes:
        ...

class FTPSClient(FTPClient):
    def __init__(self, host, port, username, password):
        self._client = FTPSDriver(host, port, user=username, password=password)
This is exactly the kind of edge case inheritance is meant for, and following Liskov makes for effective polymorphism. You'll note that now an FTPClient can be replaced by an FTPSClient or an SFTPClient. In fact, all three are interchangeable, which brings us to interface segregation.
👉 4. Interface Segregation Principle (ISP)
Definition: A client should not depend on methods it does not use.
Relevant Zen: Readability counts && Complex is better than complicated.
Unlike Liskov's, the Interface Segregation Principle was the last and most difficult principle for me to understand. I always equated it with the interface keyword, and most explanations of SOLID design don't do much to dispel that confusion. Additionally, most guides I've found try to break everything up into tiny interfaces, most often with a single function per interface, because "too many interfaces are better than too few".
There are two issues here: first, Python doesn't have interfaces; second, in languages like C# and Java that do have interfaces, breaking them up too much usually ends with interfaces implementing other interfaces, which gets complicated, and complicated is not Pythonic.
First I want to explore the too-small-interfaces problem by looking at some C# code, then we'll cover a Pythonic approach to ISP. If you agree, or are just choosing to trust me, that super-small interfaces are not the best way to segregate your interfaces, feel free to skip to the Pythonic Solution below.
// Warning: here be C# code
public interface ICanUpload {
    void upload(Byte[] file);
}

public interface ICanDownload {
    Byte[] download(string target);
}

class FTPClient : ICanUpload, ICanDownload {
    public void upload(Byte[] file) {
        ...
    }

    public Byte[] download(string target) {
        ...
    }
}
The trouble starts when you need to specify the type of a parameter that implements both the ICanDownload and ICanUpload interfaces. The code snippet below demonstrates the problem.
class ReportGenerator {
    public Byte[] doStuff(Byte[] raw) {
        ...
    }

    public void generateReport(/* What type should go here? */ client) {
        var raw_data = client.download("client_rundown.csv");
        var report = this.doStuff(raw_data);
        client.upload(report);
    }
}
In the generateReport function signature you either have to specify the concrete FTPClient class as the parameter type, which violates the Dependency Inversion Principle, or create an interface that combines the ICanUpload and ICanDownload interfaces. Otherwise, an object that implements only ICanUpload could be passed in but would fail on the download call, and vice versa for an object implementing only ICanDownload. The normal answer is to create an IFTPClient interface and let the generateReport function depend on that.
public interface IFTPClient : ICanUpload, ICanDownload {
    // No new members needed; this interface just combines the two.
}
That works, except we are still depending on FTP clients. What if we want to start storing our reports in S3?
The Pythonic Solution
To me, ISP is about making reasonable choices for how other developers will interface with your code. That's right: it's more related to the I in API and CLI than it is to the interface keyword. This is also where "Readability counts" from the Zen of Python is a driving force. A good interface follows the semantics of the abstraction and matches its terminology, making the code more readable.
Let's look at how we can add an S3Client, since it has the same upload/download semantics as the FTP clients. We want to keep the S3Client's signatures for upload and download consistent, but it would be nonsense for the new S3Client to inherit from FTPClient. After all, S3 is not a special case of FTP. What FTP and S3 do have in common is that they are file transfer protocols, and such protocols often share a similar interface, as seen in this example. So instead of inheriting from FTPClient, it would be better to tie these classes together with an abstract base class, the closest thing Python has to an interface.
We create a FileTransferClient, which becomes our interface, and all of our existing clients now inherit from that rather than from FTPClient. This forces a common interface and allows us to move bulk operations into their own interface, since not every file transfer protocol will support them.
from abc import ABC
from typing import List

class FileTransferClient(ABC):
    def upload(self, file: bytes):
        pass

    def download(self, target: str) -> bytes:
        pass

    def cd(self, target_dir):
        pass

class BulkFileTransferClient(ABC):
    def upload_bulk(self, files: List[bytes]):
        pass

    def download_bulk(self, targets: List[str]):
        pass
What does this afford us, though? Well, this.
class FTPClient(FileTransferClient, BulkFileTransferClient):
    ...

class FTPSClient(FileTransferClient, BulkFileTransferClient):
    ...

class SFTPClient(FileTransferClient, BulkFileTransferClient):
    ...

class S3Client(FileTransferClient):
    ...

class SCPClient(FileTransferClient):
    ...
Oh man, is that good code or what? We even managed to squeeze in an SCPClient and kept bulk actions as their own mixin. All of this ties together nicely with dependency injection, a technique used for the Dependency Inversion Principle.
👉 5. Dependency Inversion Principle (DIP)
Definition: High-level modules should not depend on low-level modules; both should depend on abstractions. Abstractions should not depend on details; rather, details should depend on abstractions.
Relevant Zen: Explicit is Better than Implicit
This is what ties it all together. Everything we've done with the other SOLID principles was to get to a place where we are no longer dependent on a detail, the underlying file transfer protocol, being used to move files around. We can now write code around our business rules without tying them to a specific implementation. Our code satisfies both requirements of dependency inversion.
Our high-level modules no longer need to depend on a low-level module like FTPClient, SFTPClient, or S3Client; instead, they depend on the FileTransferClient abstraction. We are depending on the abstraction of moving files, not the details of how those files are moved.
Our FileTransferClient abstraction is not dependent on protocol-specific details; instead, those details depend on how they will be used through the abstraction (i.e. that files can be uploaded or downloaded).
Here is an example of Dependency Inversion at work.
def exchange(client: FileTransferClient, to_upload: bytes, to_download: str) -> bytes:
    client.upload(to_upload)
    return client.download(to_download)

if __name__ == '__main__':
    ftp = FTPClient('ftp.host.com', 21)
    ftps = FTPSClient('ftps.host.com', 990, 'ftps_user', 'P@ssw0rd1!')
    sftp = SFTPClient('sftp.host.com', 'sftp_user', 'P@ssw0rd1!')
    s3 = S3Client('ftp.host.com')
    scp = SCPClient('ftp.host.com')

    for client in [ftp, ftps, sftp, s3, scp]:
        exchange(client, b'Hello', 'greeting.txt')
Conclusion
There you have it: a SOLID implementation that is also very Pythonic. I hope you've at least warmed up to SOLID if you hadn't before, and that for those of you learning Python and unsure how to keep writing SOLID code, this has been helpful. This was, of course, a curated example that I knew would lend itself to my argument, but in writing it I was still surprised how much changed along the way. Not every problem will fit this exact breakdown, but I've tried to include enough reasoning behind my decisions that you can choose the most SOLID and Pythonic implementation in the future.
👉 PYCHARM IDE SLOW PERFORMANCE SOLUTION (raise the inotify file-watch limit)
sudo nano /etc/sysctl.conf
fs.inotify.max_user_watches = 524288
sudo sysctl -p -> apply the change without rebooting
👉 PYINSTALLER
pyinstaller --onefile --name=passwordlocker --windowed "/home/dan/PycharmProjects/dancunmanyinsa/manage.py" (use either --onedir or --onefile, not both)
OR
In the project directory, when the development virtual environment is deactivated, type this on the terminal
pyinstaller main.py --onefile --noconsole
PYINSTALLER -> https://www.youtube.com/watch?v=p3tSLatmGvU
👉 KAZAM DEMONSTRATION
sudo apt install kazam
On how to use Kazam Screencaster https://www.youtube.com/watch?v=ADPPpY6ZEI8
👉 UI/UX DESIGN (HUGO for WEBSITES)
👉 Simple -> https://www.youtube.com/watch?v=0RKpf3rK57I
👉 Advanced -> https://www.youtube.com/watch?v=LIFvgrRxdt4
HUGO -> hugo -> Generates the public folder with static html, css and xml pages
1. Create a project folder for GitHub Pages named githubname.github.io (contains the website files, i.e. html, xml, css and a media folder, to be deployed and served to the client)
#Change the name from public to githubname.github.io (see GitHub's docs on creating GitHub Pages)
2. Push it to GitHub
3. Create a blog (contains the content files and the Hugo theme)
hugo new site blog
#Change the name to your preference
4. cd personal-blog/
#If it's a new project, proceed.
#If it's not a new project and you want to update the website, copy the (folder, file) changes from the online GitHub repository into this personal-blog directory.
5. git init
6. cd .. then cd themes/
7. git submodule add https://github.com/vaga/hugo-theme-m10c.git
8. cd .. then cd personal-blog/
9. git submodule add https://github.com/devopsengineerDan/devopsengineerdan.github.io.git
#creates a folder called public and inserts devopsengineerdan.github.io website in the directory
10. hugo new posts/best-practices-for-pyspark-ETL.md
#Hugo is written in Go and supports YAML, so while previewing drafts on the local server leave the front matter at the top of the markdown file in YAML format.
11. hugo server OR hugo server --watch --verbose
12. #The m10c theme's author used TOML as the configuration format, so change the YAML front matter at the top of the markdown file to TOML format
hugo -D then hugo -> Builds and updates the website from the recent changes to the blog content and other files.
13. Push to GitHub -> this is the blog repository
NOTE -> If the website fails to update, take the newly generated post HTML files and add them to the website folder cloned from Git, then push the new changes.
👉 NGROK => https://www.youtube.com/watch?v=hs7GLsHQCPQ
ngrok config add-authtoken 2V4ZbMGhudJBSB8F1Z303Qf0yFc_2i9fWkV8ECNg6bL8zfABS
👉 GITHUB ADD IMAGE on README.md
Go to README.md
Select Issues
Select New Issue
Select Paste, drop, or click to add files
=> Once the link has been generated copy. Go to your README.md section and paste it there.
👉 GITHUB
Create Personal Access Token on GitHub
From your GitHub account, go to Settings => Developer Settings => Personal Access Token => Generate New Token (give your password) => fill out the form => click Generate token => copy the generated token; it will look something like ghp_sFhFsSHhTzMDreGRLjmks4Tzuzgthdvfsrta
GITHUB conardmomanyimoruri
devopsengineerDan: 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
GIT FIRST ATTEMPT
git config --global user.name "DancunManyinsa"
git config --global user.email "[email protected]"
git config --global user.name "devopsengineerDan"
git config --global user.email "[email protected]"
git config -l
Once GIT is configured, we can begin using it to access GitHub. Example:
$ git clone https://github.com/YOUR-USERNAME/YOUR-REPOSITORY
> Cloning into `Spoon-Knife`...
$ Username for 'https://github.com' : username
$ Password for 'https://github.com' : give your personal access token here
Now cache the given record on your computer so it remembers the token:
$ git config --global credential.helper cache
If needed, anytime you can delete the cache record by:
$ git config --global --unset credential.helper
$ git config --system --unset credential.helper
GIT PUSHING PROJECT TO GITHUB
git init
git add README.md
git add .
git commit -m "first commit"
git branch -M main
******************************************************************************************************************************************************************************
#To avoid conflicts
FIRST OPTION
git checkout -b secondbranch ->Creates another branch named secondbranch
OR
git branch secondbranch ->Creates another branch named secondbranch
git checkout secondbranch ->Switch to the new branch
git remote add origin https://github.com/devopsengineerDan/Driver-Drowsiness-Detection.git
git push -u origin main
SECOND OPTION
git clone "_git repository link here_"
git init
git add .
git commit -m "first commit"
git branch -M main
git remote add origin "_git repository link here_"
git push -u origin main
******************************************************************************************************************************************************************************
👉 GIT PUSHING PROJECT TO GITHUB TO GH-PAGES
Create project projectname.github.io
git add README.md
git add .
git commit -m "first commit"
git branch -M main
git remote add origin https://github.com/devopsengineerdan/devopsengineerdan.github.io.git
git push -u origin main
👉 SETTING UP DOMAIN NAME DEPLOYING WEBAPPLICATION/WEBSITE
====>https://www.youtube.com/watch?v=p1QU3kLFPdg
👉 KIVY
Kivy Installation
sudo apt-get install python3-distutils
sudo python3 get-pip.py
sudo apt-get install -y python3-pip build-essential git python3 python3-dev
sudo apt-get install -y libsdl2-dev libsdl2-image-dev libsdl2-mixer-dev libsdl2-ttf-dev libportmidi-dev libswscale-dev libavformat-dev libavcodec-dev zlib1g-dev
sudo apt-get install cython
sudo pip3 install kivy
python3 main.py - code is available
sudo apt-get install libltdl-dev libffi-dev libssl-dev autoconf autotools-dev
sudo apt install -y git zip unzip openjdk-8-jdk python3-pip autoconf libtool pkg-config zlib1g-dev libncurses5-dev libncursesw5-dev libtinfo5 cmake libffi-dev libssl-dev
pip3 install --user --upgrade Cython==0.29.19 virtualenv # the --user should be removed if you do this in a venv
Add the following line at the end of your ~/.bashrc file:
export PATH=$PATH:~/.local/bin/ -> Typing the export in a terminal is temporary and lost after exiting it; adding it to ~/.bashrc (or your bash profile) makes it permanent
Kivy app to apk (Buildozer)
git clone https://github.com/kivy/buildozer.git
cd buildozer
sudo python3 setup.py install
->Go to the project directory and run
buildozer init -> Creates spec file
buildozer android debug -> Creates apk file (check for the bin directory -> apk)
buildozer init
######Edit spec file########
# (str) Title of your application
title = Driver-Drowsiness-Detection Application
# (str) Package name
package.name = driverdrowsinessdetection # package names should contain no spaces or hyphens
# (str) Package domain (needed for android/ios packaging)
package.domain = org.dancun
# (list) Source files to include (let empty to include all the files)
source.include_exts = py,png,jpg,kv,atlas
# comma separated e.g. requirements = sqlite3,kivy
requirements = python3, kivy, kivymd, cv2, dlib, playsound, scipy, imutils, pygame
->TO ENABLE DEBUG MODE
# (str) Android logcat filters to use
android.logcat_filters = *:S python:D
buildozer android deploy run
👉 HEROKU ((( H - Conardmomanyi@1 ))) GITHUB -> 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
web: gunicorn waweru.wsgi --log-file -
*********************************************************************
sudo snap install --classic heroku
wget -qO- https://toolbelt.heroku.com/install-ubuntu.sh | sh
*********************************************************************
******CLONE FROM HEROKU*******
heroku git:clone -a mathematician/your_app_name
Collectstatic error while deploying Django app to Heroku
1 disable the collectstatic during a deploy
$ heroku config:set DISABLE_COLLECTSTATIC=1
2 deploy
$ git push heroku master
3 run migrations (django 1.10 added at least one)
$ heroku run python3 manage.py migrate
4 run collectstatic using bower
$ heroku run 'bower install --config.interactive=false;grunt prep;python3 manage.py collectstatic --noinput'
5 enable collectstatic for future deploys
$ heroku config:unset DISABLE_COLLECTSTATIC
6 try it on your own (optional)
$ heroku run python manage.py collectstatic
👉 GIT COLLABORATION Link http://adzumi.co.ke/blog/github_contribution
As a new developer, contributing to a project can be scary. I get it, I was there too. It took me way too long to make my first Pull Request.
Contributing is also a great way to learn more about social coding on Github, new technologies and their ecosystems and how to make constructive, helpful bug reports, feature requests and the noblest of all contributions.
STEP 1: SET UP A WORKING COPY ON YOUR COMPUTER
First of all, you need a local fork of the project, so go ahead and press the "fork" button on GitHub. This will create a copy of the repository in your own GitHub account and you'll see a note that it's been forked underneath the project name:
Now you need a copy locally, so find the clone URL in the right-hand column and use that to clone locally using a terminal:
$ git clone https://github.com/lawrence254/StudentEngagement.git
Which will do something like this:
Now navigate to the project's directory:
$ cd StudentEngagement
Finally, in this stage, you need to set up a new remote that points to the original project so that you can grab any changes and bring them into your local copy. First click on the link to the original repository; it's labeled "Forked from" at the top of the GitHub page. This takes you back to the project's main GitHub page, so you can find the Clone URL and use it to create the new remote, which we'll call upstream.
$ git remote add upstream https://github.com/lawrence254/StudentEngagement.git
You now have two remotes for this project on disk:
origin, which points to your GitHub fork of the project. You can read and write to this remote.
upstream, which points to the main project's GitHub repository. You can only read from this remote.
STEP 2: MAKE SOME CHANGES
This is the fun bit where you get to contribute to the project. The number one rule is to put each piece of work on its own branch. If the project is using git-flow, then it will have both a master and a development branch. The general rule is that if you are adding a new feature then branch from development. If the project only has a master branch, the branch from that.
For this example, the forked repository already had a number of branches, one of them called 'lawrence'. This is the branch the owner of the repository has been working on. We'll assume we're adding a feature in StudentEngagement, so we switch to that branch:
$ git checkout lawrence
$ git pull upstream lawrence && git push origin lawrence
First we check out the 'lawrence' branch. Then the git pull command syncs our local copy with the upstream project, and the git push syncs it to our forked GitHub project.
Now you can add a feature(Or fix a bug).
If the project has tests, run them to ensure you haven't broken anything. You may also add a new test to show that your change fixes the original problem.
STEP 3: CREATE A PULL REQUEST
To create a Pull Request you need to push your branch to the origin remote and then press some buttons on GitHub.
To push the new branch:
$ git push origin lawrence
Now swap back to the browser and navigate to your fork of the project (https://github.com/YomZsamora/StudentEngagement in my case) and you'll see that the branch is listed at the top with a handy "Compare & pull request" button:
Go ahead and press the button!
On this page, ensure that the "base fork" points to the correct repository and branch. Then make sure that you provide a good, succinct title for your pull request and explain why you have created it in the description box. Add any relevant issue numbers if you have them.
If you scroll down a bit, you'll see a diff of your changes. Double-check that it contains what you expect.
Once you are happy, press the "Create pull request" button and you're done.
For your work to be integrated into the project, the maintainers will review your work and either request changes or merge it.
To Summarize
That's all there is to it. The fundamentals are:
Fork the project & clone locally.
Create an upstream remote and sync your local copy.
Ensure you're working on the correct branch.
Do the work, write good commit messages and read the CONTRIBUTING file if there is one.
Push to your origin repository.
Create a new Pull Request in GitHub.
👉 PYTHON INSTALLATION
Install pip3: sudo apt install python3-pip
python -m pip install --upgrade pip
OR python3 -m pip install --upgrade pip
Install python dev: sudo apt-get install build-essential libssl-dev libffi-dev python3-dev
python3 -m pip install python-dev-tools --user --upgrade
sudo apt install python3-setuptools
sudo apt install libpq-dev
pip install python-dev-tools
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
*****NOTE*****
.bash_profile is executed for login shells, while .bashrc is executed for interactive non-login shells. When you log in (type username and password) via the console, either sitting at the machine or remotely via ssh, .bash_profile is executed to configure your shell before the initial command prompt.
WARNING: The scripts pip, pip3, pip3.10 and pip3.8 are installed in '/home/dan/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
HENCE IN SOLVING THIS ERROR WE USE:
$ nano /home/dan/.bash_profile
export PATH="$PATH:/home/dan/.local/bin"
Save -> Ctrl + o, then Enter
Exit -> Ctrl + x
$ echo $PATH
CHECK BEST METHOD
OR (BEST METHOD)
$ export PATH="$HOME/.python3/bin:$PATH" -> AND ADD python/bin FOR KALI LINUX
$ echo $PATH
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
👉 MICROPYTHON INSTALLATION
sudo apt install micropython
sudo apt install build-essential libffi-dev git pkg-config
👉 BEST PRACTICE FOR PIP
Install pip3: sudo apt install python3-pip
python -m pip install --upgrade pip
OR python3 -m pip install --upgrade pip
Freeze Dependencies: pip freeze
pip freeze > requirements.txt
Install Dependencies: pip3 install -r requirements.txt
OR
python3 -m pip install -r requirements.txt
Upgrade Dependencies: pip3 install -r requirements.txt --upgrade
Deactivate: deactivate
👉 OLD Python env
Install _ONCE_ -> 1st Option
sudo apt-get install python3-venv
pip install --force-reinstall virtualenv
-> 2nd Option
pip install virtualenv
**Go to Project Folder**
Create env: virtualenv -p /usr/bin/python3.8 venv
Activate env: source venv/bin/activate
👉 NEW Python env
Install pip3 _ONCE_ sudo apt install python3-pip
Install _ONCE_ -> 1st Option
pip3 install pipenv
export PATH="$HOME/.pipenv/bin:$PATH"
echo $PATH
-> 2nd Option
sudo apt install python3-venv
export PATH="$HOME/.pipenv/bin:$PATH"
echo $PATH
**Go to Project Folder**
Create env: python3 -m venv my-project-env
Activate env: source my-project-env/bin/activate
Deactivate env: deactivate
*************************
$ which python3.8
*************************
Create: pipenv --python /usr/bin/python3.8
Activate: pipenv shell
Freeze Dependencies: pip freeze
pip freeze > requirements.txt
Install Dependencies: pipenv install -r requirements.txt
Upgrade Dependencies: pipenv install -r requirements.txt --upgrade
Deactivate: deactivate
👉 RUST INSTALLATION
Install Rust and Cargo check online
$ curl https://sh.rustup.rs -sSf | sh
$ source $HOME/.cargo/env
$ export PATH="$HOME/.cargo/bin:$PATH"
$ echo $PATH
👉 NAIVE APPROACH
cargo new portfolio - create "project folder" portfolio
cd portfolio
cd src
gedit main.rs -> create and edit main.rs file
rustc main.rs -> compile the program
./main -> run the program
👉 CARGO BEST PRACTICE
👉 CARGO -> Rust Build System (build and compile) and Package Manager (manages dependencies)
👉 CARGO COMMANDS
cargo init <project-name>
1st Option:(ONE SIMPLE COMMAND)
cargo run
✅ cargo run --release
2nd Option:(BEST PRACTICE)
cargo check
cargo build -> build and compile
cargo test
cargo clippy ->Best Practice (lint checks)
cargo run ->check, build and compile, execute
cargo clean
OTHER COMMANDS
cargo fix -> Automatically applies fixes suggested by compiler warnings. Ensure your use case is defined, whether you are doing parallelism or concurrency.
cargo bench
👉 RUST ENVIRONMENT => Cargo uses a shared download cache rather than a shared library directory like Python's site-packages
Dependencies aren’t normally installed into the cargo directory in Rust. There is a cache directory that is shared, but it’s a true cache and therefore not a problem in the way a shared lib directory is.
Since dependencies aren’t shared there’s no real need for isolated per-project rust installs. Cargo will pull the correct dependencies for your project when you build it and link only those dependencies into the final program, regardless of what other programs you may have built before.
👉 C DEVELOPMENT TOOLS INSTALLATION
sudo apt install build-essential
sudo apt install cmake
👉 CONAN PACKAGE MANAGER INSTALLATION
pip install conan -> C/C++ PACKAGE MANAGER
export PATH="$HOME/.conan/bin:$PATH"
echo $PATH
👉 BUILD AND COMPILE USING GCC COMPILER
Compile/Build -> gcc main.c -o main
Run -> ./main
Build and Run/Compile/Execute an entire COMPLETED project using make -> Check - https://github.com/devopsengineerDan/worldOfWalls-game
Install -> sudo make install
sudo apt-get -y install cmake
Build -> make
Run -> make run
👉 CLANG
Install Clang on Debian
Step 1. First, make sure that all your system packages are up-to-date by running the following apt commands in the terminal.
sudo apt update
sudo apt upgrade
Step 2. Installing Clang on Ubuntu 22.04.
Ubuntu 22.04 LTS typically includes the latest stable Clang version in its repositories. To verify the available Clang versions, run:
apt show clang
Now install the chosen Clang version using the package manager:
sudo apt install llvm clang
To verify that Clang has been installed successfully, check the installed version:
clang --version
Step 3. Setting Up Clang Environment Variables.
To ensure Clang works optimally, we need to configure the necessary environment variables.
Set the PATH Variable.
Add the Clang binary directory to the PATH environment variable so that you can access Clang from any location in the terminal.
echo 'export PATH=/usr/bin/:$PATH' >> ~/.bashrc
source ~/.bashrc
Set the CC and CXX Variables.
Set the CC and CXX environment variables to point to the Clang compiler for C and C++ respectively.
echo 'export CC=clang' >> ~/.bashrc
echo 'export CXX=clang++' >> ~/.bashrc
source ~/.bashrc
Step 4. Compiling C/C++ Code with Clang.
Create a new file named hello.c for C or hello.cpp for C++ and add the following code:
#include <stdio.h>

int main() {
    printf("Hello, Clang on Ubuntu 22.04 LTS!\n");
    return 0;
}
Use Clang to compile the code into an executable binary:
clang -o hello hello.c # For C
clang++ -o hello hello.cpp # For C++
Run the compiled binary to see the output:
./hello
Step 5. Tips for Optimizing Clang Performance.
To further optimize your code’s performance with Clang, consider using specific compiler flags and techniques.
Use Clang’s Optimization Flags:
-O1, -O2, -O3: Optimize the code for different levels of performance (higher levels may increase compilation time).
-march=native: Generate code optimized for the host machine’s architecture.
Code Refactoring:
Use const correctly to help Clang perform additional optimizations.
Avoid unnecessary code duplication.
Prefer inline functions over macros to enable better type-checking and optimizations.
Enable Link-Time Optimization (LTO):
Use the -flto flag during compilation to enable link-time optimization.
*******RUNNING CONAN******
Conan C/C++ Package Manager Documentation https://docs.conan.io/en/latest/
The source files to recreate this project are available in the example repository in GitHub. You can skip the manual creation of the folder and sources with this command:
$ git clone https://github.com/conan-io/examples.git && cd examples/libraries/poco/md5
Create the following source file inside a folder. This will be the source file of our application:
md5.cpp
#include "Poco/MD5Engine.h"
#include "Poco/DigestStream.h"
#include <iostream>
int main(int argc, char** argv){
    Poco::MD5Engine md5;
    Poco::DigestOutputStream ds(md5);
    ds << "abcdefghijklmnopqrstuvwxyz";
    ds.close();
    std::cout << Poco::DigestEngine::digestToHex(md5.digest()) << std::endl;
    return 0;
}
We know that our application relies on the Poco libraries. Let’s look for it in the ConanCenter remote, going to https://conan.io/center, and typing “poco” in the search box. We will see that there are some different versions available:
poco/1.8.1
poco/1.9.3
poco/1.9.4
...
NOTE
The Conan client contains a command to search in remote repositories, and we could try $ conan search poco --remote=conancenter. You can perfectly use this command to search in your own repositories, but note that at the moment this might timeout in ConanCenter. The infrastructure is being improved to support this command too, but meanwhile using the ConanCenter UI is recommended.
Let’s use this poco/1.9.4 version for our MD5 calculator app, creating a conanfile.txt inside our project’s folder with the following content:
conanfile.txt
[requires]
poco/1.9.4
[generators]
cmake
In this example we are using CMake to build the project, which is why the cmake generator is specified. This generator creates a conanbuildinfo.cmake file that defines CMake variables including paths and library names that can be used in our build. Read more about Generators.
Next step: We are going to install the required dependencies and generate the information for the build system:
Important
If you are using GCC compiler >= 5.1, Conan will set the compiler.libcxx to the old ABI for backwards compatibility. In the context of this getting started example, this is a bad choice though: Recent gcc versions will compile the example by default with the new ABI and linking will fail without further customization of your cmake configuration. You can avoid this with the following commands:
$ conan profile new default --detect # Generates default profile detecting GCC and sets old ABI
$ conan profile update settings.compiler.libcxx=libstdc++11 default # Sets libcxx to C++11 ABI
You will find more information in How to manage the GCC >= 5 ABI.
$ mkdir build && cd build
$ conan install ..
...
Requirements
bzip2/1.0.8 from 'conancenter' - Downloaded
expat/2.2.9 from 'conancenter' - Downloaded
openssl/1.1.1g from 'conancenter' - Downloaded
pcre/8.41 from 'conancenter' - Downloaded
poco/1.9.4 from 'conancenter' - Cache
sqlite3/3.31.1 from 'conancenter' - Downloaded
zlib/1.2.11 from 'conancenter' - Downloaded
Packages
bzip2/1.0.8:5be2b7a2110ec8acdbf9a1cea9de5d60747edb34 - Download
expat/2.2.9:6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7 - Download
openssl/1.1.1g:6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7 - Download
pcre/8.41:20fc3dfce989c458ac2372442673140ea8028c06 - Download
poco/1.9.4:73e83a21ea6817fa9ef0f7d1a86ea923190b0205 - Download
sqlite3/3.31.1:4559c5d4f09161e1edf374b033b1d6464826db16 - Download
zlib/1.2.11:6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7 - Download
zlib/1.2.11: Retrieving package f74366f76f700cc6e991285892ad7a23c30e6d47 from remote 'conancenter'
Downloading conanmanifest.txt completed [0.25k]
Downloading conaninfo.txt completed [0.44k]
Downloading conan_package.tgz completed [83.15k]
Decompressing conan_package.tgz completed [0.00k]
zlib/1.2.11: Package installed f74366f76f700cc6e991285892ad7a23c30e6d47
zlib/1.2.11: Downloaded package revision 0
...
poco/1.9.4: Retrieving package 645aaff0a79e6036c77803601e44677556109dd9 from remote 'conancenter'
Downloading conanmanifest.txt completed [48.75k]
Downloading conaninfo.txt completed [2.44k]
Downloading conan_package.tgz completed [5128.39k]
Decompressing conan_package.tgz completed [0.00k]
poco/1.9.4: Package installed 645aaff0a79e6036c77803601e44677556109dd9
poco/1.9.4: Downloaded package revision 0
conanfile.txt: Generator cmake created conanbuildinfo.cmake
conanfile.txt: Generator txt created conanbuildinfo.txt
conanfile.txt: Generated conaninfo.txt
conanfile.txt: Generated graphinfo
Conan installed our Poco dependency but also the transitive dependencies for it: OpenSSL, zlib, sqlite and others. It has also generated a conanbuildinfo.cmake file for our build system.
Warning
There are prebuilt binaries for several mainstream compilers and versions available in Conan Center repository, such as Visual Studio 14, 15, Linux GCC 4.9 and Apple Clang 3.5. Up to >130 different binaries for different configurations can be available in ConanCenter. But if your current configuration is not pre-built in ConanCenter, Conan will raise a “BinaryMissing” error. Please read carefully the error messages. You can build the binary package from sources using 'conan install .. --build=missing' , it will succeed if your configuration is supported by the recipe (it is possible that some ConanCenter recipes fail to build for some platforms). You will find more info in the 'Building with other configurations' section.
Now let’s create our build file. To inject the Conan information, include the generated conanbuildinfo.cmake file like this:
CMakeLists.txt
cmake_minimum_required(VERSION 2.8.12)
project(MD5Encrypter)
add_definitions("-std=c++11")
include(${CMAKE_BINARY_DIR}/conanbuildinfo.cmake)
conan_basic_setup()
add_executable(md5 md5.cpp)
target_link_libraries(md5 ${CONAN_LIBS})
Note
There are other integrations with CMake, like the cmake_find_package generators, that will use the find_package() CMake syntax (see CMake section).
Now we are ready to build and run our MD5 app:
(win)
$ cmake .. -G "Visual Studio 16"
$ cmake --build . --config Release
(linux, mac)
$ cmake .. -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Release
$ cmake --build .
...
[100%] Built target md5
$ ./bin/md5
c3fcd3d76192e4007dfb496cca67e13b
Inspecting Dependencies
The retrieved packages are installed to your local user cache (typically .conan/data) and can be reused from this location in other projects. This allows you to clean your current project and keep working even without a network connection. To search for packages in the local cache run:
$ conan search "*"
Existing package recipes:
openssl/1.0.2t
poco/1.9.4
zlib/1.2.11
...
To inspect the different binary packages of a reference run:
$ conan search poco/1.9.4@
Existing packages for recipe poco/1.9.4:
Package_ID: 645aaff0a79e6036c77803601e44677556109dd9
[options]
cxx_14: False
enable_apacheconnector: False
enable_cppparser: False
enable_crypto: True
enable_data: True
...
The @ symbol at the end of the package name is important to search for a specific package. If you don’t add the @, Conan will interpret the argument as a pattern search and return all the packages that match the poco/1.9.4 pattern and may have different user and channel.
To inspect all your current project’s dependencies use the conan info command by pointing it to the location of the conanfile.txt folder:
$ conan info ..
Building with other configurations
In this example, we have built our project using the default configuration detected by Conan. This configuration is known as the default profile.
A profile needs to be available prior to running commands such as conan install. When running the command, your settings are automatically detected (compiler, architecture…) and stored as the default profile. You can edit these settings ~/.conan/profiles/default or create new profiles with your desired configuration.
For example, if we have a profile with a 32-bit GCC configuration in a file called gcc_x86, we can run the following:
$ conan install .. --profile=gcc_x86
Tip
We strongly recommend using Profiles and managing them with conan config install.
However, the user can always override the profile settings in the conan install command using the --settings parameter. As an exercise, try building the 32-bit version of the hash calculator project like this:
$ conan install .. --settings arch=x86
The above command installs a different package, using the --settings arch=x86 instead of the one of the default profile used previously. Note you might need to install extra compilers or toolchains in some platforms, as for example, Linux distributions no longer install 32bits toolchains by default.
To use the 32-bit binaries, you will also have to change your project build:
In Windows, change the CMake invocation to Visual Studio 14.
In Linux, you have to add the -m32 flag to your CMakeLists.txt by running SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -m32"), and the same applies to CMAKE_C_FLAGS, CMAKE_SHARED_LINK_FLAGS and CMAKE_EXE_LINKER_FLAGS. This can also be done more easily, by automatically using Conan, as we’ll show later.
In macOS, you need to add the definition -DCMAKE_OSX_ARCHITECTURES=i386.
Got any doubts? Check our FAQ, write us or join the community in Cpplang Slack #conan channel!
👉 HTML5 EMAIL
<form action="https://formsubmit.co/[email protected]" method="POST">
<input type="text" name="name" required placeholder="Name">
<input type="email" name="email" required placeholder="Email">
<input type="text" name="subject" required placeholder="Subject">
<textarea name="msg" required placeholder="Message"></textarea>
<button type="submit">Send</button>
</form>
===============> TEMPLATES
BOOTSTRAPMADE.COM
BOOTSTRAP TEMPLATES + HTML TEMPLATES on my Drive
GOOGLE MAP LOCATION HTML5 ===> https://www.youtube.com/watch?v=s3g04pbAJBA
FORM SUBMIT ===> use a random email, CC, _captcha [to protect against malicious bots and spam emails], table format for the email contents, and a thank-you page
===> https://www.youtube.com/watch?v=iSobU_DjNN4
👉 MAKE PYTHON SCRIPT EXECUTABLE
$ chmod +x hello.py
$ ./hello.py
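Note that chmod is only half the story: running ./hello.py directly also needs a shebang line at the top of the script so the shell knows which interpreter to use. A minimal hello.py would look like this (a sketch; the print line is just a placeholder):

#!/usr/bin/env python3
# The shebang above lets the shell hand this file to Python when it is executed directly.
print("hello")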
👉 DJANGO, FastAPI , FLASK PROJECTS
🌀 Corey -> Django and Flask Development + DEPLOYMENT -> https://www.youtube.com/watch?v=UmljXZIypDc&list=PL-osiE80TeTtoQCKZ03TU5fNfx2UY6U4p
🌀 Caleb -> Django and Flask Development + Restful API -> https://www.youtube.com/watch?v=EuBQU_miReM
🌀 Code with Tomi -> Django and fastAPI -> https://www.youtube.com/watch?v=pRNhdI9PVmg
🌀 =================> ***************** ZUPLO API MANAGEMENT *****************
API WRAPPER Basics -> https://www.youtube.com/watch?v=Wp08QedI1ig
API WRAPPER Detailed -> https://www.youtube.com/watch?v=ZCu2QMgXcKI
API Key Authorization Postman -> https://www.youtube.com/watch?v=fjv3qfIDW6Y
API Key Authentication Best Practices -> https://www.youtube.com/watch?v=ooyOmiczY1g
API User Authentication and Token Authentication -> https://www.youtube.com/watch?v=5GxQ1rLTwaU
👉 CRASH COURSES
-> Check the link https://github.com/devopsengineerDan/python-projects/tree/master/webapps
-> Check the youtube link https://www.youtube.com/watch?v=3vfum74ggHE
👉 API WRAPPER => https://www.youtube.com/watch?v=ZCu2QMgXcKI
👉 CRASH COURSE => https://www.youtube.com/watch?v=EuBQU_miReM
👉 RESTFUL API => https://www.youtube.com/watch?v=i5JykvxUk_A
👉 STARTING A PROJECT IN DJANGO
ADMIN
1 WAY TO DO IT
$ django-admin startproject travel_project
$ cd travel_project
$ python3 manage.py migrate
$ django-admin startapp travel_app
$ python3 manage.py runserver
======
$ python3 manage.py createsuperuser
======
👉 EFFECTING CHANGES IN DJANGO
Effect changes: python3 manage.py makemigrations
python3 manage.py migrate
python3 manage.py runserver
👉 DJANGO [sql query]
View the SQL for a migration -> python3 manage.py sqlmigrate movies 0001 (app name followed by migration number)
Create migrations for model changes -> python3 manage.py makemigrations
Apply all migration changes -> python3 manage.py migrate
👉 DJANGO DEPENDENCIES
pip3 install -r requirements.txt
pip3 freeze > requirements.txt
Application
pip install django==1.11
pip install django-bootstrap4
pip install psycopg2
pip install python-decouple
pip install django-registration==2.4.1
pip install djangorestframework
pip install django-progressive-web-app
API
pip install djangorestframework
👉 BEST PRACTICE => -> https://www.youtube.com/watch?v=cbASjoZZGIw
### 1. It is plain Python.
### 2. Async built-in.
### 3. Data validation built-in. By using pydantic models.
### 4. Typed built-in. You can declare variables as either int or str without installing packages like mypy.
### 5. Errors are returned as JSON, while in Flask errors are displayed as HTML.
### 6. Authentication built-in. It supports: HTTP Basic, OAuth2 tokens (JWT tokens) and header API keys.
### 7. Swagger UI and ReDoc built-in.
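A minimal sketch of the points above (my own illustration; the Item model and routes are hypothetical): a FastAPI app with typed, validated endpoints.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):          # pydantic model: validation and typing built in
    name: str
    price: float

@app.post("/items")
async def create_item(item: Item):   # async endpoint; invalid bodies come back as JSON errors
    return {"name": item.name, "price": item.price}

@app.get("/items/{item_id}")
async def read_item(item_id: int):   # typed path parameter
    return {"item_id": item_id}

# Run with: uvicorn main:app --reload  (Swagger UI at /docs, ReDoc at /redoc)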
👉 CRASH COURSE => https://www.youtube.com/watch?v=62pP9pfzNRs
👉 RESTFUL API => https://www.youtube.com/watch?v=tLKKmouUams
👉 CREATING RESTFUL APIS FOR MACHINE LEARNING MODELS => https://www.youtube.com/watch?v=5PgqzVG9SCk
👉 CRASH COURSE => https://www.youtube.com/watch?v=Z1RJmh_OqeA
👉 RESTFUL API => https://www.youtube.com/watch?v=qbLc5a9jdXo
👉 FLASK APPLICATION STRUCTURE
|-Watchlist
|-app/
|-main/
|-__init__.py
|-errors.py
|-forms.py
|-views.py
|-static/
|-templates/
|-__init__.py
|-models.py
|-requests.py
|-tests/
|-test_movie.py
|-test_review.py
|-virtual/
|-config.py
|-.gitignore
|-manage.py
|-start.sh
👉 EXECUTING FLASK APPLICATION
export FLASK_APP=application/main.py
export FLASK_ENV=development
flask run
**BEST PRACTICE**
**Run with the command below, where FLASK_APP points to the package whose __init__.py sets a variable app to an instance of Flask**
python3 -m flask run
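A minimal sketch of such an __init__.py (illustrative only; the package name application is an assumption based on the export above):

# application/__init__.py
from flask import Flask

app = Flask(__name__)   # the variable Flask looks for when FLASK_APP points at the package

@app.route("/")
def index():
    return "Hello, Flask"

With FLASK_APP=application exported, python3 -m flask run picks this app up.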
👉 FLASK DEPENDENCIES
pip3 install -r requirements.txt
pip3 freeze > requirements.txt
Application Dependencies
pip install flask
pip install flask-bootstrap
pip install gunicorn
pip install flask-migrate
pip install flask-login
pip install flask-uploads
pip install flask-mail
pip install flask-simplemde markdown2
API Dependencies
pip install flask-restful
Other Dependencies
aniso8601==8.0.0
click==7.1.2
Flask==1.1.2
Flask-RESTful==0.3.8
Flask-SQLAlchemy==2.4.3
itsdangerous==1.1.0
Jinja2==2.11.2
MarkupSafe==1.1.1
pytz==2020.1
six==1.15.0
SQLAlchemy==1.3.18
Werkzeug==1.0.1
=================================> PYTHON FRAMEWORKS use MVT Design(Model View Template)
METHODS
HTTP METHOD -> PYTHON/FRAMEWORK METHOD -> DATABASE QUERY METHOD
GET -> get / Index ("/") -> read
POST -> add/create -> create [A NEW RECORD]
PUT -> update -> update content that already exists
DELETE -> delete -> delete
GET ====> Information is attached to the URL
POST ====> Information is attached to the Request Body
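As a small sketch of the GET-vs-POST distinction (my own illustration in Flask; the routes are hypothetical):

from flask import Flask, request

app = Flask(__name__)

@app.route("/search", methods=["GET"])
def search():
    # GET: the data rides in the URL query string, e.g. /search?q=solid
    return {"query": request.args.get("q", "")}

@app.route("/items", methods=["POST"])
def create_item():
    # POST: the data rides in the request body
    data = request.get_json(silent=True) or {}
    return data, 201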
API
====> ADVANCED
Create API + Test API Key using Postman | Create API + Test API Key using Postman
Consume API Wrapper | Create API Wrapper
Consume API | Consume API
====> INTERMEDIATE
Create API | Consume API
Consume API |
CREATE API
views.py -> 1. gets data from the model 2. serializes it via the serializers 3. returns JSON
serializers.py -> gets data from the model and defines how it is serialized
models.py -> creates the Python object mapper (ORM) used to store data, similar to SQL tables
--------------------------
Example of Create API + Test API Key using Postman
GENERATE API TOKEN ==> DJANGO ===> https://www.youtube.com/watch?v=wv8b3tzShq4
--------------------------
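A minimal Django REST Framework sketch of that views/serializers/models split (my own illustration; the Movie model and field names are hypothetical, and in separate files you would import Movie and MovieSerializer where they are used):

# models.py
from django.db import models

class Movie(models.Model):
    title = models.CharField(max_length=100)

# serializers.py
from rest_framework import serializers

class MovieSerializer(serializers.ModelSerializer):
    class Meta:
        model = Movie
        fields = ["id", "title"]

# views.py
from rest_framework.views import APIView
from rest_framework.response import Response

class MovieList(APIView):
    def get(self, request):
        movies = Movie.objects.all()                      # 1. get data from the model
        serializer = MovieSerializer(movies, many=True)   # 2. serialize it
        return Response(serializer.data)                  # 3. return JSON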
********NOTE**********
AUDIO
-> portaudio error
sudo apt-get install portaudio19-dev python-all-dev
DJANGO
-> dev error
sudo apt install libpq-dev
-> psycopg2 error
pip install psycopg2-binary
-> django_heroku error
pip install django_heroku OR pip install django-on-heroku
--------------------------------------------------------------------------------------------
AI && QUANTUM
-> import cv2 error
pip install opencv-contrib-python
pip install opencv-python
-> import dlib error (fedora)
sudo dnf install python3-dlib
-> import gi error(sound)
sudo apt-get install python3-gi
sudo apt-get install pkg-config libcairo2-dev gcc python3-dev libgirepository1.0-dev
cd /usr/lib/python3/dist-packages/gi
sudo ln -s _gi.cpython-{36m,37m}-x86_64-linux-gnu.so
pip install gobject PyGObject -> run pycharm to create sound when drowsy
pip uninstall gobject PyGObject -> open kivy window then redo step one
-> FORTRAN library that does linear algebra operations
sudo apt-get install libblas-dev liblapack-dev
sudo apt-get install libatlas-base-dev
-> No module named 'sklearn.linear_model.logistic'
pip3 install "scikit_learn==0.22.2.post1"
*************************
=====================================================================================================================================
👉 UNINSTALL DOCKER AND KUBERNETES
👉To completely uninstall Docker:
Step 1
Identify which Docker packages are installed:
dpkg -l | grep -i docker
Step 2
sudo apt-get purge -y docker-engine docker docker.io docker-ce docker-ce-cli
sudo apt-get autoremove -y --purge docker-engine docker docker.io docker-ce
The above commands will not remove images, containers, volumes, or user created configuration files on your host. If you wish to delete all images, containers, and volumes run the following commands:
sudo rm -rf /var/lib/docker /etc/docker
sudo rm /etc/apparmor.d/docker
sudo groupdel docker
sudo rm -rf /var/run/docker.sock
You have removed Docker from the system completely.
👉 To completely uninstall Kubernetes:
kubeadm reset
sudo apt-get purge kubeadm kubectl kubelet kubernetes-cni kube*
sudo apt-get autoremove
sudo rm -rf ~/.kube
kubeadm reset un-configures the cluster made by kubernetes.
👉 DOCKER NOTES
sudo docker ps - show the running containers
*sudo docker run -d -p 3000:80 daniel/helloworld - run the daniel/helloworld image detached, mapping host port 3000 to port 80 in that container
*sudo docker run -d -p 3001:80 daniel/helloworld - run a second instance, mapping host port 3001 to port 80 in that container
*sudo docker kill 607800a4d691 bcb86e3a2972 - kill both containers by their container IDs
PUSH TO DOCKER
docker login -u ef068dae42e5
docker tag ubuntu:latest ef068dae42e5/dockerhub:myfirstimagepush
docker push ef068dae42e5/dockerhub:myfirstimagepush
👉 DOCKER INITIAL CONFIGURATIONS
👉 INITIAL DOCKER SETUP FEDORA
sudo dnf remove docker-*
sudo dnf config-manager --disable docker-*
sudo grubby --update-kernel=ALL --args="systemd.unified_cgroup_hierarchy=0"
sudo firewall-cmd --permanent --zone=trusted --add-interface=docker0
sudo firewall-cmd --permanent --zone=FedoraWorkstation --add-masquerade
sudo dnf install moby-engine docker-compose
sudo systemctl enable docker
👉 INITIAL DOCKER SETUP UBUNTU
sudo apt update
sudo apt install apt-transport-https ca-certificates curl software-properties-common
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu focal stable"
sudo apt update
apt-cache policy docker-ce
sudo apt install docker-ce
sudo systemctl status docker
👉 KUBERNETES NOTES (TOOLS INCLUDE: KUBECTL, HELM AND JUJU)
kubectl, the command line tool for interacting with Kubernetes
minikube, a Kubernetes cluster manager, for small development and testing clusters
A single-node Kubernetes cluster, running in a VM.
The Kubernetes Dashboard and a simple Pod, to test out the cluster
#install minikube - a local testing platform for Kubernetes
#install kubectl - the command line tool to interact with a Kubernetes cluster
minikube start
👉 INITIAL KUBERNETES SETUP FEDORA
curl -LO https://storage.googleapis.com/minikube/releases/latest/minikube-latest.x86_64.rpm
sudo rpm -ivh minikube-latest.x86_64.rpm
sudo dnf install @virtualization
sudo systemctl start libvirtd
sudo systemctl enable libvirtd
sudo usermod --append --groups libvirt $(whoami)
minikube start
minikube kubectl -- get pods
alias kubectl="minikube kubectl --"
kubectl get nodes
minikube dashboard ****--> OPEN DASHBOARD****
sudo dnf install snapd
sudo ln -s /var/lib/snapd/snap /snap
sudo snap install kubectl --classic
👉 INITIAL KUBERNETES SETUP UBUNTU
sudo apt update
sudo apt install docker.io
sudo systemctl start docker
sudo systemctl enable docker
sudo apt install apt-transport-https curl
curl -s https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-add-repository "deb http://apt.kubernetes.io/ kubernetes-xenial main"
sudo apt install kubeadm kubelet kubectl kubernetes-cni
sudo swapoff -a
sudo nano /etc/fstab
-->Inside this file, comment out the /swapfile line by preceding it with a # symbol, as seen below. Then close the file and save the changes.
👉 TUTORIALS DOCKER && KUBERNETES
👉 DOCKER https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-ubuntu-20-04
Step 1 — Installing Docker
The Docker installation package available in the official Ubuntu repository may not be the latest version. To ensure we get the latest version, we’ll install Docker from the official Docker repository. To do that, we’ll add a new package source, add the GPG key from Docker to ensure the downloads are valid, and then install the package.
First, update your existing list of packages:
sudo apt update
Next, install a few prerequisite packages which let apt use packages over HTTPS:
sudo apt install apt-transport-https ca-certificates curl software-properties-common
Then add the GPG key for the official Docker repository to your system:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
Add the Docker repository to APT sources:
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu focal stable"
Next, update the package database with the Docker packages from the newly added repo:
sudo apt update
Make sure you are about to install from the Docker repo instead of the default Ubuntu repo:
apt-cache policy docker-ce
You’ll see output like this, although the version number for Docker may be different:
Output of apt-cache policy docker-ce
docker-ce:
Installed: (none)
Candidate: 5:19.03.9~3-0~ubuntu-focal
Version table:
5:19.03.9~3-0~ubuntu-focal 500
500 https://download.docker.com/linux/ubuntu focal/stable amd64 Packages
Notice that docker-ce is not installed, but the candidate for installation is from the Docker repository for Ubuntu 20.04 (focal).
Finally, install Docker:
sudo apt install docker-ce
Docker should now be installed, the daemon started, and the process enabled to start on boot. Check that it’s running:
sudo systemctl status docker
The output should be similar to the following, showing that the service is active and running:
Output
● docker.service - Docker Application Container Engine
Loaded: loaded (/lib/systemd/system/docker.service; enabled; vendor preset: enabled)
Active: active (running) since Tue 2020-05-19 17:00:41 UTC; 17s ago
TriggeredBy: ● docker.socket
Docs: https://docs.docker.com
Main PID: 24321 (dockerd)
Tasks: 8
Memory: 46.4M
CGroup: /system.slice/docker.service
└─24321 /usr/bin/dockerd -H fd:// --containerd=/run/containerd/containerd.sock
Installing Docker now gives you not just the Docker service (daemon) but also the docker command line utility, or the Docker client. We’ll explore how to use the docker command later in this tutorial.
Step 2 — Executing the Docker Command Without Sudo (Optional)
By default, the docker command can only be run by the root user or by a user in the docker group, which is automatically created during Docker's installation process. If you attempt to run the docker command without prefixing it with sudo or without being in the docker group, you'll get an output like this:
Output
docker: Cannot connect to the Docker daemon. Is the docker daemon running on this host?.
See 'docker run --help'.
If you want to avoid typing sudo whenever you run the docker command, add your username to the docker group:
sudo usermod -aG docker ${USER}
To apply the new group membership, log out of the server and back in, or type the following:
su - ${USER}
You will be prompted to enter your user’s password to continue.
Confirm that your user is now added to the docker group by typing:
id -nG
Output
sammy sudo docker
If you need to add a user to the docker group that you’re not logged in as, declare that username explicitly using:
sudo usermod -aG docker username
The rest of this article assumes you are running the docker command as a user in the docker group. If you choose not to, please prepend the commands with sudo.
Let’s explore the docker command next.
Step 3 — Using the Docker Command
Using docker consists of passing it a chain of options and commands followed by arguments. The syntax takes this form:
docker [option] [command] [arguments]
To view all available subcommands, type:
docker
As of Docker 19, the complete list of available subcommands includes:
Output
attach Attach local standard input, output, and error streams to a running container
build Build an image from a Dockerfile
commit Create a new image from a container's changes
cp Copy files/folders between a container and the local filesystem
create Create a new container
diff Inspect changes to files or directories on a container's filesystem
events Get real time events from the server
exec Run a command in a running container
export Export a container's filesystem as a tar archive
history Show the history of an image
images List images
import Import the contents from a tarball to create a filesystem image
info Display system-wide information
inspect Return low-level information on Docker objects
kill Kill one or more running containers
load Load an image from a tar archive or STDIN
login Log in to a Docker registry
logout Log out from a Docker registry
logs Fetch the logs of a container
pause Pause all processes within one or more containers
port List port mappings or a specific mapping for the container
ps List containers
pull Pull an image or a repository from a registry
push Push an image or a repository to a registry
rename Rename a container
restart Restart one or more containers
rm Remove one or more containers
rmi Remove one or more images
run Run a command in a new container
save Save one or more images to a tar archive (streamed to STDOUT by default)
search Search the Docker Hub for images
start Start one or more stopped containers
stats Display a live stream of container(s) resource usage statistics
stop Stop one or more running containers
tag Create a tag TARGET_IMAGE that refers to SOURCE_IMAGE
top Display the running processes of a container
unpause Unpause all processes within one or more containers
update Update configuration of one or more containers
version Show the Docker version information
wait Block until one or more containers stop, then print their exit codes
To view the options available to a specific command, type:
docker docker-subcommand --help
To view system-wide information about Docker, use:
docker info
Let’s explore some of these commands. We’ll start by working with images.
Step 4 — Working with Docker Images
Docker containers are built from Docker images. By default, Docker pulls these images from Docker Hub, a Docker registry managed by Docker, the company behind the Docker project. Anyone can host their Docker images on Docker Hub, so most applications and Linux distributions you’ll need will have images hosted there.
To check whether you can access and download images from Docker Hub, type:
docker run hello-world
The output will indicate that Docker is working correctly:
Output
Unable to find image 'hello-world:latest' locally
latest: Pulling from library/hello-world
0e03bdcc26d7: Pull complete
Digest: sha256:6a65f928fb91fcfbc963f7aa6d57c8eeb426ad9a20c7ee045538ef34847f44f1
Status: Downloaded newer image for hello-world:latest
Hello from Docker!
This message shows that your installation appears to be working correctly.
...
Docker was initially unable to find the hello-world image locally, so it downloaded the image from Docker Hub, which is the default repository. Once the image was downloaded, Docker created a container from the image, and the application within the container executed, displaying the message.
You can search for images available on Docker Hub by using the docker command with the search subcommand. For example, to search for the Ubuntu image, type:
docker search ubuntu
The script will crawl Docker Hub and return a listing of all images whose names match the search string. In this case, the output will be similar to this:
Output
NAME DESCRIPTION STARS OFFICIAL AUTOMATED
ubuntu Ubuntu is a Debian-based Linux operating sys… 10908 [OK]
dorowu/ubuntu-desktop-lxde-vnc Docker image to provide HTML5 VNC interface … 428 [OK]
rastasheep/ubuntu-sshd Dockerized SSH service, built on top of offi… 244 [OK]
consol/ubuntu-xfce-vnc Ubuntu container with "headless" VNC session… 218 [OK]
ubuntu-upstart Upstart is an event-based replacement for th… 108 [OK]
ansible/ubuntu14.04-ansible Ubuntu 14.04 LTS with
...
In the OFFICIAL column, OK indicates an image built and supported by the company behind the project. Once you’ve identified the image that you would like to use, you can download it to your computer using the pull subcommand.
Execute the following command to download the official ubuntu image to your computer:
docker pull ubuntu
You’ll see the following output:
Output
Using default tag: latest
latest: Pulling from library/ubuntu
d51af753c3d3: Pull complete
fc878cd0a91c: Pull complete
6154df8ff988: Pull complete
fee5db0ff82f: Pull complete
Digest: sha256:747d2dbbaaee995098c9792d99bd333c6783ce56150d1b11e333bbceed5c54d7
Status: Downloaded newer image for ubuntu:latest
docker.io/library/ubuntu:latest
After an image has been downloaded, you can then run a container using the downloaded image with the run subcommand. As you saw with the hello-world example, if an image has not been downloaded when docker is executed with the run subcommand, the Docker client will first download the image, then run a container using it.
To see the images that have been downloaded to your computer, type:
docker images
The output will look similar to the following:
Output
REPOSITORY TAG IMAGE ID CREATED SIZE
ubuntu latest 1d622ef86b13 3 weeks ago 73.9MB
hello-world latest bf756fb1ae65 4 months ago 13.3kB
As you’ll see later in this tutorial, images that you use to run containers can be modified and used to generate new images, which may then be uploaded (pushed is the technical term) to Docker Hub or other Docker registries.
Let’s look at how to run containers in more detail.
Step 5 — Running a Docker Container
The hello-world container you ran in the previous step is an example of a container that runs and exits after emitting a test message. Containers can be much more useful than that, and they can be interactive. After all, they are similar to virtual machines, only more resource-friendly.
As an example, let’s run a container using the latest image of Ubuntu. The combination of the -i and -t switches gives you interactive shell access into the container:
docker run -it ubuntu
Your command prompt should change to reflect the fact that you’re now working inside the container and should take this form:
Output
root@d9b100f2f636:/#
Note the container id in the command prompt. In this example, it is d9b100f2f636. You’ll need that container ID later to identify the container when you want to remove it.
Now you can run any command inside the container. For example, let’s update the package database inside the container. You don’t need to prefix any command with sudo, because you’re operating inside the container as the root user:
apt update
Then install any application in it. Let’s install Node.js:
apt install nodejs
This installs Node.js in the container from the official Ubuntu repository. When the installation finishes, verify that Node.js is installed:
node -v
You’ll see the version number displayed in your terminal:
Output
v10.19.0
Any changes you make inside the container only apply to that container.
To exit the container, type exit at the prompt.
Let’s look at managing the containers on our system next.
Step 6 — Managing Docker Containers
After using Docker for a while, you’ll have many active (running) and inactive containers on your computer. To view the active ones, use:
docker ps
You will see output similar to the following:
Output
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
In this tutorial, you started two containers: one from the hello-world image and another from the ubuntu image. Both containers are no longer running, but they still exist on your system.
To view all containers — active and inactive, run docker ps with the -a switch:
docker ps -a
You’ll see output similar to this:
Output
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
1c08a7a0d0e4 ubuntu "/bin/bash" 2 minutes ago Exited (0) 8 seconds ago quizzical_mcnulty
a707221a5f6c hello-world "/hello" 6 minutes ago Exited (0) 6 minutes ago youthful_curie
To view the most recently created container, pass docker ps the -l switch:
docker ps -l
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
1c08a7a0d0e4 ubuntu "/bin/bash" 2 minutes ago Exited (0) 40 seconds ago quizzical_mcnulty
To start a stopped container, use docker start, followed by the container ID or the container’s name. Let’s start the Ubuntu-based container with the ID of 1c08a7a0d0e4:
docker start 1c08a7a0d0e4
The container will start, and you can use docker ps to see its status:
Output
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
1c08a7a0d0e4 ubuntu "/bin/bash" 3 minutes ago Up 5 seconds quizzical_mcnulty
To stop a running container, use docker stop, followed by the container ID or name. This time, we’ll use the name that Docker assigned the container, which is quizzical_mcnulty:
docker stop quizzical_mcnulty
Once you’ve decided you no longer need a container, remove it with the docker rm command, again using either the container ID or the name. Use the docker ps -a command to find the container ID or name for the container associated with the hello-world image and remove it.
docker rm youthful_curie
You can start a new container and give it a name using the --name switch. You can also use the --rm switch to create a container that removes itself when it’s stopped. See the docker run help command for more information on these options and others.
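For example, the following starts an interactive, named container that removes itself when it stops (the name my_ubuntu is only illustrative):
docker run -it --rm --name my_ubuntu ubuntu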
Containers can be turned into images which you can use to build new containers. Let’s look at how that works.
Step 7 — Committing Changes in a Container to a Docker Image
When you start up a Docker image, you can create, modify, and delete files just like you can with a virtual machine. The changes that you make will only apply to that container. You can start and stop it, but once you destroy it with the docker rm command, the changes will be lost for good.
This section shows you how to save the state of a container as a new Docker image.
After installing Node.js inside the Ubuntu container, you now have a container running off an image, but the container is different from the image you used to create it. You might want to reuse this Node.js container as the basis for new images later.
To do so, commit the changes to a new Docker image using the following command:
docker commit -m "What you did to the image" -a "Author Name" container_id repository/new_image_name
The -m switch is for the commit message that helps you and others know what changes you made, while -a is used to specify the author. The container_id is the one you noted earlier in the tutorial when you started the interactive Docker session. Unless you created additional repositories on Docker Hub, the repository is usually your Docker Hub username.
For example, for the user sammy, with the container ID of d9b100f2f636, the command would be:
docker commit -m "added Node.js" -a "sammy" d9b100f2f636 sammy/ubuntu-nodejs
When you commit an image, the new image is saved locally on your computer. Later in this tutorial, you’ll learn how to push an image to a Docker registry like Docker Hub so others can access it.
Listing the Docker images again will show the new image, as well as the old one that it was derived from:
docker images
You’ll see output like this:
Output
REPOSITORY TAG IMAGE ID CREATED SIZE
sammy/ubuntu-nodejs latest 7c1f35226ca6 7 seconds ago 179MB
...
In this example, ubuntu-nodejs is the new image, which was derived from the existing ubuntu image from Docker Hub. The size difference reflects the changes that were made; in this example, the change was that Node.js was installed. So the next time you need to run a container using Ubuntu with Node.js pre-installed, you can just use the new image.
You can also build images from a Dockerfile, which lets you automate the installation of software in a new image. However, that’s outside the scope of this tutorial.
Now let’s share the new image with others so they can create containers from it.
Step 8 — Pushing Docker Images to a Docker Repository
The next logical step after creating a new image from an existing image is to share it with a select few of your friends, the whole world on Docker Hub, or another Docker registry that you have access to. To push an image to Docker Hub or any other Docker registry, you must have an account there.
This section shows you how to push a Docker image to Docker Hub. To learn how to create your own private Docker registry, check out How To Set Up a Private Docker Registry on Ubuntu 14.04.
To push your image, first log into Docker Hub.
docker login -u docker-registry-username
You’ll be prompted to authenticate using your Docker Hub password. If you specified the correct password, authentication should succeed.
Note: If your Docker registry username is different from the local username you used to create the image, you will have to tag your image with your registry username. For the example given in the last step, you would type:
docker tag sammy/ubuntu-nodejs docker-registry-username/ubuntu-nodejs
Then you may push your own image using:
docker push docker-registry-username/docker-image-name
To push the ubuntu-nodejs image to the sammy repository, the command would be:
docker push sammy/ubuntu-nodejs
The process may take some time to complete as it uploads the images, but when completed, the output will look like this:
Output
The push refers to a repository [docker.io/sammy/ubuntu-nodejs]
e3fbbfb44187: Pushed
5f70bf18a086: Pushed
a3b5c80a4eba: Pushed
7f18b442972b: Pushed
3ce512daaf78: Pushed
7aae4540b42d: Pushed
...
After pushing an image to a registry, it should be listed on your account’s dashboard, like the one shown in the image below.
New Docker image listing on Docker Hub
If a push attempt results in an error of this sort, then you likely did not log in:
Output
The push refers to a repository [docker.io/sammy/ubuntu-nodejs]
e3fbbfb44187: Preparing
5f70bf18a086: Preparing
a3b5c80a4eba: Preparing
7f18b442972b: Preparing
3ce512daaf78: Preparing
7aae4540b42d: Waiting
unauthorized: authentication required
Log in with docker login and repeat the push attempt. Then verify that it exists on your Docker Hub repository page.
You can now use docker pull sammy/ubuntu-nodejs to pull the image to a new machine and use it to run a new container.
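For instance, on another machine you could run the following, reusing the example image name from above:
docker pull sammy/ubuntu-nodejs
docker run -it sammy/ubuntu-nodejs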
👉 KUBERNETES
1ST OPTION -> https://computingforgeeks.com/deploy-kubernetes-cluster-on-ubuntu-with-kubeadm/
https://linuxconfig.org/how-to-install-kubernetes-on-ubuntu-20-04-focal-fossa-linux
Install Docker
Both of the nodes will need to have Docker installed on them, as Kubernetes relies on it. Open a terminal and type the following commands on both the master and the worker node to install Docker:
$ sudo apt update
$ sudo apt install docker.io
Once Docker has finished installing, use the following commands to start the service and to make sure it starts automatically after each reboot:
$ sudo systemctl start docker
$ sudo systemctl enable docker
Install Kubernetes
Now we are ready to install Kubernetes. Just like all the other commands up to this point, make sure that you are doing this on both nodes. On your Kubernetes master and worker, first install the apt-transport-https package, which will allow us to use http and https in Ubuntu's repositories. Now is also a good time to install curl since we will need it in a moment:
$ sudo apt install apt-transport-https curl
Next, add the Kubernetes signing key to both systems:
$ curl -s https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
Next, we'll add the Kubernetes package repository. Note that at the time of this writing, Ubuntu 16.04 Xenial Xerus is the latest Kubernetes repository available. This should eventually be superseded by Ubuntu 20.04 Focal Fossa, and the following command can then be updated from xenial to focal.
$ sudo apt-add-repository "deb http://apt.kubernetes.io/ kubernetes-xenial main"
Now we can install Kubernetes:
$ sudo apt install kubeadm kubelet kubectl kubernetes-cni
Disable swap memory
Kubernetes will refuse to function if your system is using swap memory. Before proceeding further, make sure that the master and worker node have swap memory disabled with this command:
$ sudo swapoff -a
That command will disable swap memory until your systems reboot, so to make this change persist, use nano or your favorite text editor to open this file:
$ sudo nano /etc/fstab
Inside this file, comment out the /swapfile line by preceding it with a # symbol, as seen below. Then, close this file and save the changes.
Add # to comment out swapfile line
Set hostnames
Next, ensure that all of your nodes have a unique hostname. In our scenario, we're using the hostnames kubernetes-master and kubernetes-worker to easily differentiate our hosts and identify their roles. Use the following command if you need to change your hostnames:
$ sudo hostnamectl set-hostname kubernetes-master
And on the worker node:
$ sudo hostnamectl set-hostname kubernetes-worker
You won't notice the hostname changes in the terminal until you open a new one. Lastly, make sure that all of your nodes have an accurate time and date, otherwise you will run into trouble with invalid TLS certificates.
Initialize Kubernetes master server
Now we're ready to initialize the Kubernetes master node. To do so, enter the following command on your master node:
kubernetes-master:~$ sudo kubeadm init
Kubernetes on Ubuntu 20.04 master node is now initialized
The Kubernetes master node has now been initialized. The output gives us a kubeadm join command that we will need to use later to join our worker node(s) to the master node. So, take note of this command for later.
The output from above also advises us to run several commands as a regular user to start using the Kubernetes cluster. Run those three commands on the master node:
kubernetes-master:~$ mkdir -p $HOME/.kube
kubernetes-master:~$ sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
kubernetes-master:~$ sudo chown $(id -u):$(id -g) $HOME/.kube/config
Deploy a pod network
The next step is to deploy a pod network. The pod network is used for communication between hosts and is necessary for the Kubernetes cluster to function properly. For this we will use the Flannel pod network. Issue the following two commands on the master node:
kubernetes-master:~$ kubectl apply -f https://raw.githubusercontent.com/coreos/flannel/master/Documentation/kube-flannel.yml
kubernetes-master:~$ kubectl apply -f https://raw.githubusercontent.com/coreos/flannel/master/Documentation/k8s-manifests/kube-flannel-rbac.yml
Depending on your environment, it may take just a few seconds or a minute to bring the entire flannel network up. You can use the kubectl command to confirm that everything is up and ready:
kubernetes-master:~$ kubectl get pods --all-namespaces
Pod network is successfully deployed
When the STATUS column shows 'Running' for all of the pods, it's an indication that everything is finished deploying and good to go.
Join the Kubernetes cluster
Now our cluster is ready to have the worker nodes join. Use the kubeadm join command retrieved earlier from the Kubernetes master node initialization output to join your Kubernetes cluster:
kubernetes-worker:~$ sudo kubeadm join 192.168.1.220:6443 --token 1exb8s.2t4k3b5syfc3jfmo --discovery-token-ca-cert-hash sha256:72ad481cee4918cf2314738419356c9a402fb609263adad48c13797d0cba2341
Joining worker node to Kubernetes cluster
Back on your Kubernetes master node, confirm that kubernetes-worker is now part of our Kubernetes cluster with this command:
kubernetes-master:~$ kubectl get nodes
Displays what nodes are currently in the Kubernetes cluster
Deploying a service on Kubernetes cluster
Now we are ready to deploy a service into the Kubernetes cluster. In our example, we will deploy an Nginx server into our new cluster as a proof of concept. Run the following two commands on your master node:
kubernetes-master:~$ kubectl run --image=nginx nginx-server --port=80 --env="DOMAIN=cluster"
kubernetes-master:~$ kubectl expose deployment nginx-server --port=80 --name=nginx-http
You should now see a new nginx docker container deployed on your worker node:
kubernetes-worker:~$ sudo docker ps
New docker Nginx container is up and running on Kubernetes worker node
You can see a list of all services running in your cluster with the following command, issued from the Kubernetes master node:
kubernetes-master:~$ kubectl get svc
Displays what containerized services are running on the Kubernetes cluster
Conclusion
In this article, we learned how to set up Kubernetes to deploy containerized applications on Ubuntu 20.04 Focal Fossa. We set up a basic cluster consisting of two hosts, a master and a worker, though this can be scaled to many more worker nodes if necessary.
We saw how to configure Docker and other prerequisites, as well as how to deploy an Nginx server in our new cluster as a proof of concept. Of course, this same configuration can be used to deploy any number of containerized applications.
=====================================================================================================================================================================
👉 WEKA DATA SCIENCE SOFTWARE
cd weka-3-8-3/
java -jar weka.jar
👉 INSTALLING ANACONDA
******OLD*******
cd /tmp
sudo apt install curl
curl -O https://repo.anaconda.com/archive/Anaconda3-2019.03-Linux-x86_64.sh
sha256sum Anaconda3-2019.03-Linux-x86_64.sh
bash Anaconda3-2019.03-Linux-x86_64.sh
conda list
conda update conda
conda update anaconda
*****NEW*****
How to Install Anaconda in Ubuntu 22.04
$ sudo apt install curl -y
After that, use the following curl command to download the Anaconda installer script. Also, make sure you download the latest version by first visiting the official site:
https://repo.anaconda.com/archive/
$ cd /tmp
$ curl --output anaconda.sh https://repo.anaconda.com/archive/Anaconda3-5.3.1-Linux-x86_64.sh
$ sha256sum anaconda.sh
$ bash anaconda.sh
$ source ~/.bashrc
$ conda list
$ conda --version
***-> Install PyTorch from the official website, selecting the options that match your hardware for optimised performance
***-> Install XGBoost from the official website, selecting the options that match your hardware for optimised performance
+++++++++++++++++++++
$ cd ~
cd anaconda3/
source $HOME/anaconda3/bin/activate
conda list
conda update conda
conda deactivate
👉 SET UP JUPYTER NOTEBOOK
conda install -c anaconda ipykernel
INSTALL DEBUGGER
pip install nbdev (or: conda install nbdev)
conda install xeus-python -c conda-forge
Enable the extension manager and the debugger in JupyterLab
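Once these are installed, one way to start JupyterLab from the activated environment is simply:
jupyter lab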
+++++++++++++++++++++
package cache : /home/dan/anaconda3/pkgs
/home/dan/.conda/pkgs
envs directories : /home/dan/anaconda3/envs
/home/dan/.conda/envs
source $HOME/anaconda3/bin/activate
conda create --name Daniel_projects
conda activate Daniel_projects
conda deactivate
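Inside an activated environment you can then install packages as needed, for example (package names just illustrative):
conda install numpy pandas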
👉 IMPORT FILES FROM DRIVE INTO GOOGLE-COLAB:
STEP-1: Import Libraries
# Code to read csv file into colaboratory:
!pip install -U -q PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
STEP-2: Authenticate E-Mail ID
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
STEP-3: Get File from Drive using file-ID
#2.1 Get the file
downloaded = drive.CreateFile({'id':'1R9vW5dmox7i8OGOoDql_r9yUrvAJgvhD'}) # replace the id with id of file you want to access
downloaded.GetContentFile('resources.csv')
STEP-4: Read File
#3.1 Read file as panda dataframe
import pandas as pd
xyz = pd.read_csv('resources.csv')
print(xyz.head(1))
#Repeat STEPs 3 & 4 to load as many files as you require.
👉 FEATURE ENGINEERING https://towardsdatascience.com/feature-engineering-for-machine-learning-3a5e293a5114
👉 BEST TECHNIQUE
• ExhaustiveFeatureSelector: Optimal feature sets by considering all possible feature combinations
• Overview
• Example 1 - A simple Iris example
• Example 2 - Visualizing the feature selection results
• Example 3 - Exhaustive feature selection for regression analysis
• Example 4 - Regression and adjusted R2
• Example 5 - Using the selected feature subset For making new predictions
• Example 6 - Exhaustive feature selection and GridSearch
• Example 7 - Exhaustive Feature Selection with LOOCV
• Example 8 - Interrupting Long Runs for Intermediate Results
• Example 9 - Working with Feature Groups
• API
ExhaustiveFeatureSelector: Optimal feature sets by considering all possible feature combinations
Implementation of an exhaustive feature selector for sampling and evaluating all possible feature combinations in a specified range.
from mlxtend.feature_selection import ExhaustiveFeatureSelector
Overview
This exhaustive feature selection algorithm is a wrapper approach for brute-force evaluation of feature subsets; the best subset is selected by optimizing a specified performance metric given an arbitrary regressor or classifier. For instance, if the classifier is a logistic regression and the dataset consists of 4 features, the algorithm will evaluate all 15 feature combinations (if min_features=1 and max_features=4):
• {0}
• {1}
• {2}
• {3}
• {0, 1}
• {0, 2}
• {0, 3}
• {1, 2}
• {1, 3}
• {2, 3}
• {0, 1, 2}
• {0, 1, 3}
• {0, 2, 3}
• {1, 2, 3}
• {0, 1, 2, 3}
and select the one that results in the best performance (e.g., classification accuracy) of the logistic regression classifier.
Example 1 - A simple Iris example
Initializing a simple classifier from scikit-learn:
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import load_iris
from mlxtend.feature_selection import ExhaustiveFeatureSelector as EFS
iris = load_iris()
X = iris.data
y = iris.target
knn = KNeighborsClassifier(n_neighbors=3)
efs1 = EFS(knn,
min_features=1,
max_features=4,
scoring='accuracy',
print_progress=True,
cv=5)
efs1 = efs1.fit(X, y)
print('Best accuracy score: %.2f' % efs1.best_score_)
print('Best subset (indices):', efs1.best_idx_)
print('Best subset (corresponding names):', efs1.best_feature_names_)
Features: 15/15
Best accuracy score: 0.97
Best subset (indices): (0, 2, 3)
Best subset (corresponding names): ('0', '2', '3')
Feature Names
When working with large datasets, the feature indices might be hard to interpret. In this case, we recommend using pandas DataFrames with distinct column names as input:
import pandas as pd
df_X = pd.DataFrame(X, columns=["Sepal length", "Sepal width", "Petal length", "Petal width"])
df_X.head()
Sepal length Sepal width Petal length Petal width
0 5.1 3.5 1.4 0.2
1 4.9 3.0 1.4 0.2
2 4.7 3.2 1.3 0.2
3 4.6 3.1 1.5 0.2
4 5.0 3.6 1.4 0.2
efs1 = efs1.fit(df_X, y)
print('Best accuracy score: %.2f' % efs1.best_score_)
print('Best subset (indices):', efs1.best_idx_)
print('Best subset (corresponding names):', efs1.best_feature_names_)
Features: 15/15
Best accuracy score: 0.97
Best subset (indices): (0, 2, 3)
Best subset (corresponding names): ('Sepal length', 'Petal length', 'Petal width')
Detailed Outputs
Via the subsets_ attribute, we can take a look at the selected feature indices at each step:
efs1.subsets_
{0: {'feature_idx': (0,),
'cv_scores': array([0.53333333, 0.63333333, 0.7 , 0.8 , 0.56666667]),
'avg_score': 0.6466666666666667,
'feature_names': ('Sepal length',)},
1: {'feature_idx': (1,),
'cv_scores': array([0.43333333, 0.63333333, 0.53333333, 0.43333333, 0.5 ]),
'avg_score': 0.5066666666666666,
'feature_names': ('Sepal width',)},
2: {'feature_idx': (2,),
'cv_scores': array([0.93333333, 0.93333333, 0.9 , 0.93333333, 1. ]),
'avg_score': 0.9400000000000001,
'feature_names': ('Petal length',)},
3: {'feature_idx': (3,),
'cv_scores': array([0.96666667, 0.96666667, 0.93333333, 0.93333333, 1. ]),
'avg_score': 0.96,
'feature_names': ('Petal width',)},
4: {'feature_idx': (0, 1),
'cv_scores': array([0.66666667, 0.8 , 0.7 , 0.86666667, 0.66666667]),
'avg_score': 0.74,
'feature_names': ('Sepal length', 'Sepal width')},
5: {'feature_idx': (0, 2),
'cv_scores': array([0.96666667, 1. , 0.86666667, 0.93333333, 0.96666667]),
'avg_score': 0.9466666666666667,
'feature_names': ('Sepal length', 'Petal length')},
6: {'feature_idx': (0, 3),
'cv_scores': array([0.96666667, 0.96666667, 0.9 , 0.93333333, 1. ]),
'avg_score': 0.9533333333333334,
'feature_names': ('Sepal length', 'Petal width')},
7: {'feature_idx': (1, 2),
'cv_scores': array([0.93333333, 0.93333333, 0.9 , 0.93333333, 0.93333333]),
'avg_score': 0.9266666666666667,
'feature_names': ('Sepal width', 'Petal length')},
8: {'feature_idx': (1, 3),
'cv_scores': array([0.96666667, 0.96666667, 0.86666667, 0.93333333, 0.96666667]),
'avg_score': 0.9400000000000001,
'feature_names': ('Sepal width', 'Petal width')},
9: {'feature_idx': (2, 3),
'cv_scores': array([0.96666667, 0.96666667, 0.9 , 0.93333333, 1. ]),
'avg_score': 0.9533333333333334,
'feature_names': ('Petal length', 'Petal width')},
10: {'feature_idx': (0, 1, 2),
'cv_scores': array([0.96666667, 0.96666667, 0.86666667, 0.93333333, 0.96666667]),
'avg_score': 0.9400000000000001,
'feature_names': ('Sepal length', 'Sepal width', 'Petal length')},
11: {'feature_idx': (0, 1, 3),
'cv_scores': array([0.93333333, 0.96666667, 0.9 , 0.93333333, 1. ]),
'avg_score': 0.9466666666666667,
'feature_names': ('Sepal length', 'Sepal width', 'Petal width')},
12: {'feature_idx': (0, 2, 3),
'cv_scores': array([0.96666667, 0.96666667, 0.96666667, 0.96666667, 1. ]),
'avg_score': 0.9733333333333334,
'feature_names': ('Sepal length', 'Petal length', 'Petal width')},
13: {'feature_idx': (1, 2, 3),
'cv_scores': array([0.96666667, 0.96666667, 0.93333333, 0.93333333, 1. ]),
'avg_score': 0.96,
'feature_names': ('Sepal width', 'Petal length', 'Petal width')},
14: {'feature_idx': (0, 1, 2, 3),
'cv_scores': array([0.96666667, 0.96666667, 0.93333333, 0.96666667, 1. ]),
'avg_score': 0.9666666666666668,
'feature_names': ('Sepal length',
'Sepal width',
'Petal length',
'Petal width')}}
Example 2 - Visualizing the feature selection results
For our convenience, we can visualize the output from the feature selection in a pandas DataFrame format using the get_metric_dict method of the ExhaustiveFeatureSelector object. The columns std_dev and std_err represent the standard deviation and standard errors of the cross-validation scores, respectively.
Below, we see the DataFrame for the Exhaustive Feature Selector results:
import pandas as pd
iris = load_iris()
X = iris.data
y = iris.target
knn = KNeighborsClassifier(n_neighbors=3)
efs1 = EFS(knn,
min_features=1,
max_features=4,
scoring='accuracy',
print_progress=True,
cv=5)
feature_names = ('sepal length', 'sepal width',
'petal length', 'petal width')
df_X = pd.DataFrame(
X, columns=["Sepal length", "Sepal width", "Petal length", "Petal width"])
efs1 = efs1.fit(df_X, y)
df = pd.DataFrame.from_dict(efs1.get_metric_dict()).T
df.sort_values('avg_score', inplace=True, ascending=False)
df
Features: 15/15
feature_idx cv_scores avg_score feature_names ci_bound std_dev std_err
12 (0, 2, 3) [0.9666666666666667, 0.9666666666666667, 0.966... 0.973333 (Sepal length, Petal length, Petal width) 0.017137 0.013333 0.006667
14 (0, 1, 2, 3) [0.9666666666666667, 0.9666666666666667, 0.933... 0.966667 (Sepal length, Sepal width, Petal length, Peta... 0.027096 0.021082 0.010541
3 (3,) [0.9666666666666667, 0.9666666666666667, 0.933... 0.96 (Petal width,) 0.032061 0.024944 0.012472
13 (1, 2, 3) [0.9666666666666667, 0.9666666666666667, 0.933... 0.96 (Sepal width, Petal length, Petal width) 0.032061 0.024944 0.012472
6 (0, 3) [0.9666666666666667, 0.9666666666666667, 0.9, ... 0.953333 (Sepal length, Petal width) 0.043691 0.033993 0.016997
9 (2, 3) [0.9666666666666667, 0.9666666666666667, 0.9, ... 0.953333 (Petal length, Petal width) 0.043691 0.033993 0.016997
5 (0, 2) [0.9666666666666667, 1.0, 0.8666666666666667, ... 0.946667 (Sepal length, Petal length) 0.058115 0.045216 0.022608
11 (0, 1, 3) [0.9333333333333333, 0.9666666666666667, 0.9, ... 0.946667 (Sepal length, Sepal width, Petal width) 0.043691 0.033993 0.016997
2 (2,) [0.9333333333333333, 0.9333333333333333, 0.9, ... 0.94 (Petal length,) 0.041977 0.03266 0.01633
8 (1, 3) [0.9666666666666667, 0.9666666666666667, 0.866... 0.94 (Sepal width, Petal width) 0.049963 0.038873 0.019437
10 (0, 1, 2) [0.9666666666666667, 0.9666666666666667, 0.866... 0.94 (Sepal length, Sepal width, Petal length) 0.049963 0.038873 0.019437
7 (1, 2) [0.9333333333333333, 0.9333333333333333, 0.9, ... 0.926667 (Sepal width, Petal length) 0.017137 0.013333 0.006667
4 (0, 1) [0.6666666666666666, 0.8, 0.7, 0.8666666666666... 0.74 (Sepal length, Sepal width) 0.102823 0.08 0.04
0 (0,) [0.5333333333333333, 0.6333333333333333, 0.7, ... 0.646667 (Sepal length,) 0.122983 0.095685 0.047842
1 (1,) [0.43333333333333335, 0.6333333333333333, 0.53... 0.506667 (Sepal width,) 0.095416 0.074237 0.037118
import matplotlib.pyplot as plt
metric_dict = efs1.get_metric_dict()
fig = plt.figure()
k_feat = sorted(metric_dict.keys())
avg = [metric_dict[k]['avg_score'] for k in k_feat]
upper, lower = [], []
for k in k_feat:
upper.append(metric_dict[k]['avg_score'] +
metric_dict[k]['std_dev'])
lower.append(metric_dict[k]['avg_score'] -
metric_dict[k]['std_dev'])
plt.fill_between(k_feat,
upper,
lower,
alpha=0.2,
color='blue',
lw=1)
plt.plot(k_feat, avg, color='blue', marker='o')
plt.ylabel('Accuracy +/- Standard Deviation')
plt.xlabel('Number of Features')
feature_min = len(metric_dict[k_feat[0]]['feature_idx'])
feature_max = len(metric_dict[k_feat[-1]]['feature_idx'])
plt.xticks(k_feat,
[str(metric_dict[k]['feature_names']) for k in k_feat],
rotation=90)
plt.show()
Example 3 - Exhaustive feature selection for regression analysis
Similar to the classification examples above, the ExhaustiveFeatureSelector also supports scikit-learn's estimators for regression.
from sklearn.linear_model import LinearRegression
from sklearn.datasets import load_boston
boston = load_boston()
X, y = boston.data, boston.target
lr = LinearRegression()
efs = EFS(lr,
min_features=10,
max_features=12,
scoring='neg_mean_squared_error',
cv=10)
efs.fit(X, y)
print('Best MSE score: %.2f' % (efs.best_score_ * (-1)))
print('Best subset:', efs.best_idx_)
/Users/sebastianraschka/miniforge3/lib/python3.9/site-packages/sklearn/utils/deprecation.py:87: FutureWarning: Function load_boston is deprecated; `load_boston` is deprecated in 1.0 and will be removed in 1.2.
The Boston housing prices dataset has an ethical problem. You can refer to
the documentation of this function for further details.
The scikit-learn maintainers therefore strongly discourage the use of this
dataset unless the purpose of the code is to study and educate about
ethical issues in data science and machine learning.
In this special case, you can fetch the dataset from the original
source::
import pandas as pd
import numpy as np
data_url = "http://lib.stat.cmu.edu/datasets/boston"
raw_df = pd.read_csv(data_url, sep="\s+", skiprows=22, header=None)
data = np.hstack([raw_df.values[::2, :], raw_df.values[1::2, :2]])
target = raw_df.values[1::2, 2]
Alternative datasets include the California housing dataset (i.e.
:func:`~sklearn.datasets.fetch_california_housing`) and the Ames housing
dataset. You can load the datasets as follows::
from sklearn.datasets import fetch_california_housing
housing = fetch_california_housing()
for the California housing dataset and::
from sklearn.datasets import fetch_openml
housing = fetch_openml(name="house_prices", as_frame=True)
for the Ames housing dataset.
warnings.warn(msg, category=FutureWarning)
Features: 377/377
Best subset: (0, 1, 4, 6, 7, 8, 9, 10, 11, 12)
Example 4 - Regression and adjusted R2
As shown in Example 3, the exhaustive feature selector can be used for selecting features via a regression model. In regression analysis, there is a common phenomenon that the R² score becomes spuriously inflated the more features we choose. Hence, and this is especially true for feature selection, it is useful to make model comparisons based on the adjusted R² value rather than the regular R². The adjusted R², denoted R̄², accounts for the number of features and examples as follows:
R̄² = 1 − (1 − R²) × (n − 1) / (n − p − 1),
where n is the number of examples and p is the number of features.
One of the advantages of scikit-learn's API is that it's consistent, intuitive, and simple to use. However, one downside of this API design is that it can be a bit restrictive for certain scenarios. For instance, scikit-learn scoring functions only take two inputs, the predicted and the true target values. Hence, we cannot use scikit-learn's scoring API to compute the adjusted R², which also requires the number of features.
However, as a workaround, we can compute the R² for the different feature subsets and then do a post-hoc computation to obtain the adjusted R².
Step 1: Compute R²:
from sklearn.linear_model import LinearRegression
from sklearn.datasets import load_boston
boston = load_boston()
X, y = boston.data, boston.target
lr = LinearRegression()
efs = EFS(lr,
min_features=10,
max_features=12,
scoring='r2',
cv=10)
efs.fit(X, y)
print('Best R2 score: %.2f' % efs.best_score_)
print('Best subset:', efs.best_idx_)
/Users/sebastianraschka/miniforge3/lib/python3.9/site-packages/sklearn/utils/deprecation.py:87: FutureWarning: Function load_boston is deprecated; `load_boston` is deprecated in 1.0 and will be removed in 1.2.
The Boston housing prices dataset has an ethical problem. You can refer to
the documentation of this function for further details.
The scikit-learn maintainers therefore strongly discourage the use of this
dataset unless the purpose of the code is to study and educate about
ethical issues in data science and machine learning.
In this special case, you can fetch the dataset from the original
source::
import pandas as pd
import numpy as np
data_url = "http://lib.stat.cmu.edu/datasets/boston"
raw_df = pd.read_csv(data_url, sep="\s+", skiprows=22, header=None)
data = np.hstack([raw_df.values[::2, :], raw_df.values[1::2, :2]])
target = raw_df.values[1::2, 2]
Alternative datasets include the California housing dataset (i.e.
:func:`~sklearn.datasets.fetch_california_housing`) and the Ames housing
dataset. You can load the datasets as follows::
from sklearn.datasets import fetch_california_housing
housing = fetch_california_housing()
for the California housing dataset and::
from sklearn.datasets import fetch_openml
housing = fetch_openml(name="house_prices", as_frame=True)
for the Ames housing dataset.
warnings.warn(msg, category=FutureWarning)
Features: 377/377
Best subset: (1, 3, 5, 6, 7, 8, 9, 10, 11, 12)
Step 2: Compute the adjusted R²:
def adjust_r2(r2, num_examples, num_features):
coef = (num_examples - 1) / (num_examples - num_features - 1)
return 1 - (1 - r2) * coef
for i in efs.subsets_:
efs.subsets_[i]['adjusted_avg_score'] = (
adjust_r2(r2=efs.subsets_[i]['avg_score'],
num_examples=X.shape[0]/10,
num_features=len(efs.subsets_[i]['feature_idx']))
)
Step 3: Select the best subset based on the adjusted R²:
score = -99e10
for i in efs.subsets_:
    if efs.subsets_[i]['adjusted_avg_score'] > score or \
            (efs.subsets_[i]['adjusted_avg_score'] == score and
             len(efs.subsets_[i]['feature_idx']) < len(efs.best_idx_)):
        efs.best_idx_ = efs.subsets_[i]['feature_idx']
        score = efs.subsets_[i]['adjusted_avg_score']
print('Best adjusted R2 score: %.2f' % score)
print('Best subset:', efs.best_idx_)
Best subset: (1, 3, 5, 6, 7, 8, 9, 10, 11, 12)
Example 5 - Using the selected feature subset For making new predictions
# Initialize the dataset
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, y_test = train_test_split(
X, y, test_size=0.33, random_state=1)
knn = KNeighborsClassifier(n_neighbors=3)
# Select the "best" three features via
# 5-fold cross-validation on the training set.
from mlxtend.feature_selection import ExhaustiveFeatureSelector as EFS
efs1 = EFS(knn,
min_features=1,
max_features=4,
scoring='accuracy',
cv=5)
efs1 = efs1.fit(X_train, y_train)
Features: 15/15
print('Selected features:', efs1.best_idx_)
Selected features: (2, 3)
# Generate the new subsets based on the selected features
# Note that the transform call is equivalent to
# X_train[:, efs1.best_idx_]
X_train_efs = efs1.transform(X_train)
X_test_efs = efs1.transform(X_test)
# Fit the estimator using the new feature subset
# and make a prediction on the test data
knn.fit(X_train_efs, y_train)
y_pred = knn.predict(X_test_efs)
# Compute the accuracy of the prediction
acc = float((y_test == y_pred).sum()) / y_pred.shape[0]
print('Test set accuracy: %.2f %%' % (acc*100))
Test set accuracy: 96.00 %
Example 6 - Exhaustive feature selection and GridSearch
# Initialize the dataset
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, y_test = train_test_split(
X, y, test_size=0.33, random_state=1)
Use scikit-learn's GridSearch to tune the hyperparameters of the LogisticRegression estimator inside the ExhaustiveFeatureSelector and use it for prediction in the pipeline. Note that the clone_estimator attribute needs to be set to False.
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from mlxtend.feature_selection import ExhaustiveFeatureSelector as EFS
lr = LogisticRegression(multi_class='multinomial',
solver='newton-cg',
random_state=123)
efs1 = EFS(estimator=lr,
min_features=2,
max_features=3,
scoring='accuracy',
print_progress=False,
clone_estimator=False,
cv=5,
n_jobs=1)
pipe = make_pipeline(efs1, lr)
param_grid = {'exhaustivefeatureselector__estimator__C': [0.1, 1.0, 10.0]}
gs = GridSearchCV(estimator=pipe,
param_grid=param_grid,
scoring='accuracy',
n_jobs=1,
cv=2,
verbose=1,
refit=False)
# run grid search
gs = gs.fit(X_train, y_train)
Fitting 2 folds for each of 3 candidates, totalling 6 fits
... and the "best" parameters determined by GridSearch are ...
print("Best parameters via GridSearch", gs.best_params_)
Best parameters via GridSearch {'exhaustivefeatureselector__estimator__C': 0.1}
Obtaining the best k feature indices after GridSearch
If we are interested in the best k feature indices via ExhaustiveFeatureSelector.best_idx_, we have to initialize a GridSearchCV object with refit=True. Now, the grid search object will take the complete training dataset and the best parameters, which it found via cross-validation, to train the estimator pipeline.
gs = GridSearchCV(estimator=pipe,
param_grid=param_grid,
scoring='accuracy',
n_jobs=1,
cv=2,
verbose=1,
refit=True)
After running the grid search, we can access the individual pipeline objects of the best_estimator_ via the steps attribute.
gs = gs.fit(X_train, y_train)
gs.best_estimator_.steps
Fitting 2 folds for each of 3 candidates, totalling 6 fits
[('exhaustivefeatureselector',
ExhaustiveFeatureSelector(clone_estimator=False,
estimator=LogisticRegression(C=0.1,
multi_class='multinomial',
random_state=123,
solver='newton-cg'),
feature_groups=[[0], [1], [2], [3]], max_features=3,
min_features=2, print_progress=False)),
('logisticregression',
LogisticRegression(multi_class='multinomial', random_state=123,
solver='newton-cg'))]
Via sub-indexing, we can then obtain the best-selected feature subset:
print('Best features:', gs.best_estimator_.steps[0][1].best_idx_)
Best features: (2, 3)
During cross-validation, this feature combination had a CV accuracy of:
print('Best score:', gs.best_score_)
Best score: 0.96
gs.best_params_
{'exhaustivefeatureselector__estimator__C': 0.1}
Alternatively, we can set the "best grid search parameters" in our pipeline manually if we ran GridSearchCV with refit=False. It should yield the same results:
pipe.set_params(**gs.best_params_).fit(X_train, y_train)
print('Best features:', pipe.steps[0][1].best_idx_)
Best features: (2, 3)
Example 7 - Exhaustive Feature Selection with LOOCV
The ExhaustiveFeatureSelector is not restricted to k-fold cross-validation. You can use any type of cross-validation method that supports the general scikit-learn cross-validation API.
The following example illustrates the use of scikit-learn's LeaveOneOut cross-validation method in combination with the exhaustive feature selector.
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import load_iris
from mlxtend.feature_selection import ExhaustiveFeatureSelector as EFS
from sklearn.model_selection import LeaveOneOut
iris = load_iris()
X = iris.data
y = iris.target
knn = KNeighborsClassifier(n_neighbors=3)
efs1 = EFS(knn,
min_features=1,
max_features=4,
scoring='accuracy',
print_progress=True,
cv=LeaveOneOut()) ### Use cross-validation generator here
efs1 = efs1.fit(X, y)
print('Best accuracy score: %.2f' % efs1.best_score_)
print('Best subset (indices):', efs1.best_idx_)
print('Best subset (corresponding names):', efs1.best_feature_names_)
Features: 15/15
Best accuracy score: 0.96
Best subset (indices): (3,)
Best subset (corresponding names): ('3',)
Example 8 - Interrupting Long Runs for Intermediate Results
If your run is taking too long, it is possible to trigger a KeyboardInterrupt (e.g., ctrl+c on a Mac, or interrupting the cell in a Jupyter notebook) to obtain temporary results.
Toy dataset
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
X, y = make_classification(
n_samples=200000,
n_features=6,
n_informative=2,
n_redundant=1,
n_repeated=1,
n_clusters_per_class=2,
flip_y=0.05,
class_sep=0.5,
random_state=123,
)
X_train, X_test, y_train, y_test = train_test_split(
X, y, test_size=0.2, random_state=123
)
Long run with interruption
from mlxtend.feature_selection import ExhaustiveFeatureSelector as EFS
from sklearn.linear_model import LogisticRegression
model = LogisticRegression(max_iter=10000)
efs1 = EFS(model,
min_features=1,
max_features=4,
print_progress=True,
scoring='accuracy')
efs1 = efs1.fit(X_train, y_train)
Features: 56/56
Finalizing the fit
Note that the feature selection run hasn't finished, so certain attributes may not be available. In order to use the EFS instance, it is recommended to call finalize_fit, which will make the EFS estimator appear as "fitted" and process the temporary results:
efs1.finalize_fit()
print('Best accuracy score: %.2f' % efs1.best_score_)
print('Best subset (indices):', efs1.best_idx_)
Best accuracy score: 0.73
Best subset (indices): (1, 2)
Example 9 - Working with Feature Groups
Since mlxtend v0.21.0, it is possible to specify feature groups. Feature groups allow you to group certain features together, such that they are always selected as a group. This can be very useful in contexts similar to one-hot encoding -- if you want to treat the one-hot encoded feature as a single feature:
In the following example, we specify sepal length and sepal width as a feature group so that they are always selected together:
from sklearn.datasets import load_iris
import pandas as pd
iris = load_iris()
X = iris.data
y = iris.target
X_df = pd.DataFrame(X, columns=['sepal len', 'petal len',
'sepal wid', 'petal wid'])
X_df.head()
sepal len petal len sepal wid petal wid
0 5.1 3.5 1.4 0.2
1 4.9 3.0 1.4 0.2
2 4.7 3.2 1.3 0.2
3 4.6 3.1 1.5 0.2
4 5.0 3.6 1.4 0.2
from sklearn.neighbors import KNeighborsClassifier
from mlxtend.feature_selection import ExhaustiveFeatureSelector as EFS
knn = KNeighborsClassifier(n_neighbors=3)
efs1 = EFS(knn,
min_features=2,
max_features=2,
scoring='accuracy',
feature_groups=[['sepal len', 'sepal wid'], ['petal len'], ['petal wid']],
cv=3)
efs1 = efs1.fit(X_df, y)
print('Best accuracy score: %.2f' % efs1.best_score_)
print('Best subset (indices):', efs1.best_idx_)
print('Best subset (corresponding names):', efs1.best_feature_names_)
Features: 3/3
Best accuracy score: 0.97
Best subset (indices): (0, 2, 3)
Best subset (corresponding names): ('sepal len', 'sepal wid', 'petal wid')
Notice that the returned number of features is 3, since min_features and max_features correspond to the number of feature groups. That is, the selected combination consists of the 2 feature groups ['sepal len', 'sepal wid'] and ['petal wid'], which expand to 3 individual features.
API
ExhaustiveFeatureSelector(estimator, min_features=1, max_features=1, print_progress=True, scoring='accuracy', cv=5, n_jobs=1, pre_dispatch='2*n_jobs', clone_estimator=True, fixed_features=None, feature_groups=None)
Exhaustive Feature Selection for Classification and Regression. (new in v0.4.3)
Parameters
• estimator : scikit-learn classifier or regressor
• min_features : int (default: 1)
Minimum number of features to select
• max_features : int (default: 1)
Maximum number of features to select. If parameter feature_groups is not None, the number of features is equal to the number of feature groups, i.e. len(feature_groups). For example, if feature_groups = [[0], [1], [2, 3], [4]], then the max_features value cannot exceed 4.
• print_progress : bool (default: True)
Prints progress as the number of epochs to stderr.
• scoring : str, (default='accuracy')
Scoring metric in {accuracy, f1, precision, recall, roc_auc} for classifiers, {'mean_absolute_error', 'mean_squared_error', 'median_absolute_error', 'r2'} for regressors, or a callable object or function with signature scorer(estimator, X, y).
• cv : int (default: 5)
Scikit-learn cross-validation generator or int. If estimator is a classifier (or y consists of integer class labels), stratified k-fold is performed, and regular k-fold cross-validation otherwise. No cross-validation if cv is None, False, or 0.
• n_jobs : int (default: 1)
The number of CPUs to use for evaluating different feature subsets in parallel. -1 means 'all CPUs'.
• pre_dispatch : int, or string (default: '2*n_jobs')
Controls the number of jobs that get dispatched during parallel execution if n_jobs > 1 or n_jobs=-1. Reducing this number can be useful to avoid an explosion of memory consumption when more jobs get dispatched than CPUs can process. This parameter can be: None, in which case all the jobs are immediately created and spawned (use this for lightweight and fast-running jobs, to avoid delays due to on-demand spawning of the jobs); an int, giving the exact number of total jobs that are spawned; or a string, giving an expression as a function of n_jobs, as in '2*n_jobs'.
• clone_estimator : bool (default: True)
Clones estimator if True; works with the original estimator instance if False. Set to False if the estimator doesn't implement scikit-learn's set_params and get_params methods. In addition, it is required to set cv=0, and n_jobs=1.
• fixed_features : tuple (default: None)
If not None, the feature indices provided as a tuple will be regarded as fixed by the feature selector. For example, if fixed_features=(1, 3, 7), the 2nd, 4th, and 8th feature are guaranteed to be present in the solution. Note that if fixed_features is not None, make sure that the number of features to be selected is greater than len(fixed_features). In other words, ensure that k_features > len(fixed_features).
• feature_groups : list or None (default: None)
Optional argument for treating certain features as a group. This means the features within a group are always selected together, never split. For example, feature_groups=[[1], [2], [3, 4, 5]] specifies 3 feature groups. In this case, possible feature selection results with k_features=2 are [[1], [2]], [[1], [3, 4, 5]], or [[2], [3, 4, 5]]. Feature groups can be useful for interpretability, for example, if features 3, 4, 5 are one-hot encoded features. (For more details, please read the notes at the bottom of this docstring.) New in mlxtend v. 0.21.0.
Attributes
• best_idx_ : array-like, shape = [n_predictions]
Feature Indices of the selected feature subsets.
• best_feature_names_ : array-like, shape = [n_predictions]
Feature names of the selected feature subsets. If pandas DataFrames are used in the fit method, the feature names correspond to the column names. Otherwise, the feature names are string representation of the feature array indices. New in v 0.13.0.
• best_score_ : float
Cross validation average score of the selected subset.
• subsets_ : dict
A dictionary of selected feature subsets during the exhaustive selection, where the dictionary keys are the lengths k of these feature subsets. The dictionary values are dictionaries themselves with the following keys: 'feature_idx' (tuple of indices of the feature subset), 'feature_names' (tuple of feature names of the feature subset), 'cv_scores' (list of individual cross-validation scores), and 'avg_score' (average cross-validation score). Note that if pandas DataFrames are used in the fit method, the 'feature_names' correspond to the column names. Otherwise, the feature names are string representations of the feature array indices. The 'feature_names' is new in v. 0.13.0.
Notes
(1) If parameter feature_groups is not None, the number of features is equal to the number of feature groups, i.e. len(feature_groups). For example, if feature_groups = [[0], [1], [2, 3], [4]], then the max_features value cannot exceed 4.
(2) Although two or more individual features may be considered as one group
throughout the feature-selection process, it does not mean the individual
features of that group have the same impact on the outcome. For instance, in
linear regression, the coefficient of the feature 2 and 3 can be different
even if they are considered as one group in feature_groups.
(3) If both fixed_features and feature_groups are specified, ensure that each
feature group contains the fixed_features selection. E.g., for a 3-feature set
fixed_features=[0, 1] and feature_groups=[[0, 1], [2]] is valid;
fixed_features=[0, 1] and feature_groups=[[0], [1, 2]] is not valid.
Examples
For usage examples, please see http://rasbt.github.io/mlxtend/user_guide/feature_selection/ExhaustiveFeatureSelector/
Methods
fit(X, y, groups=None, **fit_params)
Perform feature selection and learn model from training data.
Parameters
• X : {array-like, sparse matrix}, shape = [n_samples, n_features]
Training vectors, where n_samples is the number of samples and n_features is the number of features. New in v 0.13.0: pandas DataFrames are now also accepted as argument for X.
• y : array-like, shape = [n_samples]
Target values.
• groups : array-like, with shape (n_samples,), optional
Group labels for the samples used while splitting the dataset into train/test set. Passed to the fit method of the cross-validator.
• fit_params : dict of string -> object, optional
Parameters to pass to the fit method of the classifier.
Returns
• self : object
fit_transform(X, y, groups=None, **fit_params)
Fit to training data and return the best selected features from X.
Parameters
• X : {array-like, sparse matrix}, shape = [n_samples, n_features]
Training vectors, where n_samples is the number of samples and n_features is the number of features. New in v 0.13.0: pandas DataFrames are now also accepted as argument for X.
• y : array-like, shape = [n_samples]
Target values.
• groups : array-like, with shape (n_samples,), optional
Group labels for the samples used while splitting the dataset into train/test set. Passed to the fit method of the cross-validator.
• fit_params : dict of string -> object, optional
Parameters to pass to the fit method of the classifier.
Returns
Feature subset of X, shape={n_samples, k_features}
get_metric_dict(confidence_interval=0.95)
Return metric dictionary
Parameters
• confidence_interval : float (default: 0.95)
A positive float between 0.0 and 1.0 to compute the confidence interval bounds of the CV score averages.
Returns
Dictionary with items where each dictionary value is a list with the number of iterations (number of feature subsets) as its length. The dictionary keys corresponding to these lists are as follows: 'feature_idx' (tuple of the indices of the feature subset), 'cv_scores' (list with individual CV scores), 'avg_score' (the CV average score), 'std_dev' (standard deviation of the CV score average), 'std_err' (standard error of the CV score average), and 'ci_bound' (confidence interval bound of the CV score average).
get_params(deep=True)
Get parameters for this estimator.
Parameters
• deep : bool, default=True
If True, will return the parameters for this estimator and contained subobjects that are estimators.
Returns
• params : dict
Parameter names mapped to their values.
set_params(**params)
Set the parameters of this estimator.
The method works on simple estimators as well as on nested objects
(such as :class:`~sklearn.pipeline.Pipeline`). The latter have
parameters of the form ``<component>__<parameter>`` so that it's
possible to update each component of a nested object.
Parameters
• **params : dict
Estimator parameters.
Returns
• self : estimator instance
Estimator instance.
transform(X)
Return the best selected features from X.
Parameters
• X : {array-like, sparse matrix}, shape = [n_samples, n_features]
Training vectors, where n_samples is the number of samples and n_features is the number of features. New in v 0.13.0: pandas DataFrames are now also accepted as argument for X.
Returns
Feature subset of X, shape={n_samples, k_features}
👉 OTHER TECHNIQUES
List of Techniques
1.Imputation
2.Handling Outliers
3.Binning
4.Log Transform
5.One-Hot Encoding
6.Grouping Operations
7.Feature Split
8.Scaling
9.Extracting Date
1.Imputation
Missing values are one of the most common problems you can encounter when you try to prepare your data for machine learning. The reason for the missing values might be human errors, interruptions in the data flow, privacy concerns, and so on. Whatever the reason, missing values affect the performance of machine learning models.
Some machine learning platforms automatically drop the rows that include missing values in the model training phase, which decreases the model performance because of the reduced training set size. On the other hand, most algorithms do not accept datasets with missing values and give an error.
The simplest solution to the missing values is to drop the rows or the entire column. There is no optimum threshold for dropping, but you can use 70% as an example value and try to drop the rows and columns whose missing value rate is higher than this threshold.
threshold = 0.7
#Dropping columns with missing value rate higher than threshold
data = data[data.columns[data.isnull().mean() < threshold]]
#Dropping rows with missing value rate higher than threshold
data = data.loc[data.isnull().mean(axis=1) < threshold]
Numerical Imputation
Imputation is a more preferable option than dropping because it preserves the data size. However, what you impute in place of the missing values matters. I suggest starting by considering a possible default value for the missing values in the column. For example, if you have a column that only has 1 and NA, then it is likely that the NA rows correspond to 0. As another example, if you have a column that shows the "customer visit count in the last month", the missing values might be replaced with 0 as long as you think that is a sensible solution.
Another reason for missing values is joining tables of different sizes, and in this case, imputing 0 might be reasonable as well.
Except for the case of having a default value for missing values, I think the best way to impute is to use the medians of the columns, because the averages of the columns are sensitive to outlier values, while medians are more robust in this respect.
#Filling all missing values with 0
data = data.fillna(0)
#Filling missing values with medians of the columns
data = data.fillna(data.median())
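If you prefer a library helper over manual fillna, a hedged alternative is scikit-learn's SimpleImputer (assuming scikit-learn is installed; 'col_a' and 'col_b' are hypothetical numeric column names):
#Median imputation with scikit-learn (sketch)
from sklearn.impute import SimpleImputer
num_cols = ['col_a', 'col_b'] #hypothetical numeric columns
data[num_cols] = SimpleImputer(strategy='median').fit_transform(data[num_cols])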
Categorical Imputation
Replacing the missing values with the most frequently occurring value in a column is a good option for handling categorical columns. But if you think the values in the column are distributed uniformly and there is no dominant value, imputing a category like “Other” might be more sensible, because in such a case your imputation would otherwise amount to a random selection.
#Max fill function for categorical columns
data['column_name'].fillna(data['column_name'].value_counts()
.idxmax(), inplace=True)
2.Handling Outliers
Before discussing how outliers can be handled, I want to state that the best way to detect outliers is to inspect the data visually. All statistical methodologies are open to making mistakes, whereas visualizing the outliers gives you a chance to make a decision with high precision. I plan to cover visualization in depth in another article, so let's continue with statistical methodologies.
Statistical methodologies are less precise, as mentioned, but they have one advantage: they are fast. Here I will list two different ways of handling outliers: detecting them using the standard deviation, and using percentiles.
Outlier Detection with Standard Deviation
If a value's distance to the average is higher than x * standard deviation, it can be assumed to be an outlier. Then what should x be?
There is no trivial solution for x, but usually a value between 2 and 4 is practical.
#Dropping the outlier rows with standard deviation
factor = 3
upper_lim = data['column'].mean () + data['column'].std () * factor
lower_lim = data['column'].mean () - data['column'].std () * factor
data = data[(data['column'] < upper_lim) & (data['column'] > lower_lim)]
In addition, z-score can be used instead of the formula above. Z-score (or standard score) standardizes the distance between a value and the mean using the standard deviation.
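As a sketch of that z-score variant (the |z| < 3 cutoff mirrors the factor above and is a choice, not a rule):
#Dropping the outlier rows with z-scores
z_scores = (data['column'] - data['column'].mean()) / data['column'].std()
data = data[z_scores.abs() < 3]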
Outlier Detection with Percentiles
Another mathematical method to detect outliers is to use percentiles. You can treat a certain percentage of the values from the top or the bottom as outliers. The key point is, once again, setting that percentage value, and this depends on the distribution of your data as mentioned earlier.
Additionally, a common mistake is applying the percentiles to the range of the data. In other words, if your data ranges from 0 to 100, your top 5% is not the values between 96 and 100. Top 5% here means the values outside the 95th percentile of the data.
#Dropping the outlier rows with Percentiles
upper_lim = data['column'].quantile(.95)
lower_lim = data['column'].quantile(.05)
data = data[(data['column'] < upper_lim) & (data['column'] > lower_lim)]
An Outlier Dilemma: Drop or Cap
Another option for handling outliers is to cap them instead of dropping them. This keeps your data size, and at the end of the day it might be better for the final model performance.
On the other hand, capping can affect the distribution of the data, so it is better not to overdo it.
#Capping the outlier rows with Percentiles
upper_lim = data['column'].quantile(.95)
lower_lim = data['column'].quantile(.05)
data.loc[(data['column'] > upper_lim), 'column'] = upper_lim
data.loc[(data['column'] < lower_lim), 'column'] = lower_lim
3.Binning
Binning illustration of numerical data
Binning can be applied on both categorical and numerical data:
#Numerical Binning Example
Value Bin
0-30 -> Low
31-70 -> Mid
71-100 -> High
#Categorical Binning Example
Value Bin
Spain -> Europe
Italy -> Europe
Chile -> South America
Brazil -> South America
The main motivation of binning is to make the model more robust and prevent overfitting; however, it comes at a cost to performance. Every time you bin something, you sacrifice information and make your data more regularized. (Please see regularization in machine learning.)
The trade-off between performance and overfitting is the key point of the binning process. In my opinion, for numerical columns, except for some obvious overfitting cases, binning might be redundant for some kinds of algorithms, due to its effect on model performance.
However, for categorical columns, labels with low frequencies probably affect the robustness of statistical models negatively. Thus, assigning a general category to these less frequent values helps keep the model robust. For example, if your data size is 100,000 rows, it might be a good option to unite the labels with a count less than 100 into a new category like “Other”.
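A minimal sketch of that rare-label grouping (the column name and the 100-count threshold are illustrative):
#Uniting labels with a count less than 100 into "Other"
counts = data['column'].value_counts()
rare_labels = counts[counts < 100].index
data['column'] = data['column'].replace(rare_labels, 'Other')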
#Numerical Binning Example
data['bin'] = pd.cut(data['value'], bins=[0,30,70,100], labels=["Low", "Mid", "High"])
value bin
0 2 Low
1 45 Mid
2 7 Low
3 85 High
4 28 Low
#Categorical Binning Example
Country
0 Spain
1 Chile
2 Australia
3 Italy
4 Brazil
conditions = [
data['Country'].str.contains('Spain'),
data['Country'].str.contains('Italy'),
data['Country'].str.contains('Chile'),
data['Country'].str.contains('Brazil')]
choices = ['Europe', 'Europe', 'South America', 'South America']
data['Continent'] = np.select(conditions, choices, default='Other')
Country Continent
0 Spain Europe
1 Chile South America
2 Australia Other
3 Italy Europe
4 Brazil South America
4.Log Transform
Logarithm transformation (or log transform) is one of the most commonly used mathematical transformations in feature engineering. The benefits of log transform:
It helps to handle skewed data, and after the transformation the distribution becomes closer to normal.
In most cases, the order of magnitude of the data changes within the range of the data. For instance, the difference between ages 15 and 20 is not the same as between ages 65 and 70. In terms of years they are identical, but in every other respect a 5-year difference at a young age means a much bigger difference. This type of data comes from a multiplicative process, and log transform normalizes magnitude differences like that.
It also decreases the effect of outliers, thanks to the normalization of magnitude differences, and the model becomes more robust.
A critical note: the data you apply log transform to must have only positive values, otherwise you receive an error. You can add 1 to your data before transforming it, which ensures the output of the transformation is positive.
Log(x+1)
#Log Transform Example
data = pd.DataFrame({'value':[2,45, -23, 85, 28, 2, 35, -12]})
data['log+1'] = (data['value']+1).transform(np.log)
#Negative Values Handling
#Note that the values are different
data['log'] = (data['value']-data['value'].min()+1) .transform(np.log)
value log(x+1) log(x-min(x)+1)
0 2 1.09861 3.25810
1 45 3.82864 4.23411
2 -23 nan 0.00000
3 85 4.45435 4.69135
4 28 3.36730 3.95124
5 2 1.09861 3.25810
6 35 3.58352 4.07754
7 -12 nan 2.48491
5.One-hot encoding
One-hot encoding is one of the most common encoding methods in machine learning. This method spreads the values in a column across multiple flag columns and assigns 0 or 1 to them. These binary values express the relationship between the grouped and the encoded column.
This method changes your categorical data, which is challenging for algorithms to understand, into a numerical format, and enables you to group your categorical data without losing any information. (For details please see the last part of Categorical Column Grouping.)
One hot encoding example on City column
Why One-Hot?: If you have N distinct values in the column, it is enough to map them to N-1 binary columns, because the missing one can be deduced from the other columns: if all the other columns are 0, the missing value must be 1. This is why it is called one-hot encoding. However, I will give an example using the get_dummies function of Pandas, which maps all the values in a column to multiple columns.
encoded_columns = pd.get_dummies(data['column'])
data = data.join(encoded_columns).drop('column', axis=1)
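If you want the N-1 column mapping described above instead, pandas can drop the first level directly; a small sketch:
#One-hot encoding with N-1 columns
encoded_columns = pd.get_dummies(data['column'], drop_first=True)
data = data.join(encoded_columns).drop('column', axis=1)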
6.Grouping Operations
In most machine learning algorithms, every instance is represented by a row in the training dataset, where every column shows a different feature of the instance. This kind of data is called “tidy”.
Tidy datasets are easy to manipulate, model and visualise, and have a specific structure: each variable is a column, each observation is a row, and each type of observational unit is a table.
— Hadley Wickham
Datasets such as transactions rarely fit the definition of tidy data above, because of the multiple rows of an instance. In such a case, we group the data by the instances and then every instance is represented by only one row.
The key point of group-by operations is to decide the aggregation functions for the features. For numerical features, average and sum are usually convenient options, whereas for categorical features it is more complicated.
Categorical Column Grouping
I suggest three different ways for aggregating categorical columns:
The first option is to select the label with the highest frequency. In other words, this is the max operation for categorical columns, but ordinary max functions generally do not return this value; you need to use a lambda function for this purpose.
data.groupby('id').agg(lambda x: x.value_counts().index[0])
The second option is to make a pivot table. This approach resembles the encoding method of the preceding step, with a difference: instead of binary notation, it uses aggregation functions on the values between the grouped and encoded columns. This is a good option if you aim to go beyond binary flag columns and merge multiple features into aggregated features, which are more informative.
Pivot table example: Sum of Visit Days grouped by Users
#Pivot table Pandas Example
data.pivot_table(index='column_to_group', columns='column_to_encode', values='aggregation_column', aggfunc=np.sum, fill_value = 0)
The last categorical grouping option is to apply a group-by function after applying one-hot encoding. This method preserves all the data (in the first option you lose some), and in addition you transform the encoded column from categorical to numerical at the same time. You can check the next section for the explanation of numerical column grouping; a small sketch of this option follows.
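A minimal sketch of this third option (the 'id' and 'column' names are placeholders, not from the examples above):
#One-hot encode, then aggregate the flag columns per instance id
encoded = pd.get_dummies(data['column'], prefix='column')
data = data.join(encoded)
grouped = data.groupby('id')[list(encoded.columns)].sum()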
Numerical Column Grouping
Numerical columns are grouped using sum and mean functions in most cases. Either can be preferable depending on the meaning of the feature. For example, if you want to obtain ratio columns, you can use the average of binary columns. In the same example, the sum function can be used to obtain the total count as well.
#sum_cols: List of columns to sum
#mean_cols: List of columns to average
grouped = data.groupby('column_to_group')
sums = grouped[sum_cols].sum().add_suffix('_sum')
avgs = grouped[mean_cols].mean().add_suffix('_avg')
new_df = pd.concat([sums, avgs], axis=1)
7.Feature Split
Splitting features is a good way to make them useful for machine learning. Most of the time the dataset contains string columns that violate tidy data principles. By extracting the usable parts of a column into new features:
We enable machine learning algorithms to comprehend them.
We make it possible to bin and group them.
We improve model performance by uncovering potential information.
The split function is a good option; however, there is no single way of splitting features. How to split a column depends on its characteristics. Let's introduce it with two examples. First, a simple split function for an ordinary name column:
data.name
0 Luther N. Gonzalez
1 Charles M. Young
2 Terry Lawson
3 Kristen White
4 Thomas Logsdon
#Extracting first names
data.name.str.split(" ").map(lambda x: x[0])
0 Luther
1 Charles
2 Terry
3 Kristen
4 Thomas
#Extracting last names
data.name.str.split(" ").map(lambda x: x[-1])
0 Gonzalez
1 Young
2 Lawson
3 White
4 Logsdon
The example above handles names longer than two words by taking only the first and last elements, which makes the function robust to corner cases; this should be kept in mind when manipulating strings like that.
Another use case for the split function is extracting a string part between two characters. The following example shows an implementation of this case using two split functions in a row.
#String extraction example
data.title.head()
0 Toy Story (1995)
1 Jumanji (1995)
2 Grumpier Old Men (1995)
3 Waiting to Exhale (1995)
4 Father of the Bride Part II (1995)
data.title.str.split("(", n=1, expand=True)[1].str.split(")", n=1, expand=True)[0]
0 1995
1 1995
2 1995
3 1995
4 1995
8.Scaling
In most cases, the numerical features of a dataset do not share a common range, and they differ from each other. In real life, it makes no sense to expect the age and income columns to have the same range. But from the machine learning point of view, how can these two columns be compared?
Scaling solves this problem. After a scaling process, the continuous features become identical in terms of range. This process is not mandatory for many algorithms, but it might still be nice to apply. However, algorithms based on distance calculations, such as k-NN or k-Means, need scaled continuous features as model input.
Basically, there are two common ways of scaling:
Normalization
Normalization formula: X_norm = (X - X_min) / (X_max - X_min)
Normalization (or min-max normalization) scales all values into a fixed range between 0 and 1. This transformation does not change the shape of the feature's distribution, but because the standard deviation decreases, the effects of outliers increase. Therefore, it is recommended to handle the outliers before normalization.
data = pd.DataFrame({'value':[2,45, -23, 85, 28, 2, 35, -12]})
data['normalized'] = (data['value'] - data['value'].min()) / (data['value'].max() - data['value'].min())
value normalized
0 2 0.23
1 45 0.63
2 -23 0.00
3 85 1.00
4 28 0.47
5 2 0.23
6 35 0.54
7 -12 0.10
Standardization
Standardization (or z-score normalization) scales the values while taking the standard deviation into account. If the standard deviations of the features differ, their ranges after scaling also differ from each other. This reduces the effect of outliers in the features.
In the following formula of standardization, the mean is shown as μ and the standard deviation is shown as σ.
Standardization formula: z = (x - μ) / σ
data = pd.DataFrame({'value':[2,45, -23, 85, 28, 2, 35, -12]})
data['standardized'] = (data['value'] - data['value'].mean()) / data['value'].std()
value standardized
0 2 -0.52
1 45 0.70
2 -23 -1.23
3 85 1.84
4 28 0.22
5 2 -0.52
6 35 0.42
7 -12 -0.92
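For completeness, a hedged equivalent of both scalers using scikit-learn (assuming scikit-learn is installed; remember to fit on training data only to avoid leakage):
#Scaling with scikit-learn (sketch)
from sklearn.preprocessing import MinMaxScaler, StandardScaler
data['normalized'] = MinMaxScaler().fit_transform(data[['value']]).ravel()
data['standardized'] = StandardScaler().fit_transform(data[['value']]).ravel()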
9.Extracting Date
Though date columns usually provide valuable information about the model target, they are either neglected as an input or used in ways that make little sense to machine learning algorithms. One reason is that dates can appear in numerous formats, which makes them hard for algorithms to understand, even when simplified to a format like "01-01-2017".
Building an ordinal relationship between the values is very challenging for a machine learning algorithm if you leave the date columns unmanipulated. Here, I suggest three types of preprocessing for dates:
Extracting the parts of the date into different columns: Year, month, day, etc.
Extracting the time period between the current date and columns in terms of years, months, days, etc.
Extracting some specific features from the date: Name of the weekday, Weekend or not, holiday or not, etc.
If you transform the date column into extracted columns like the ones above, their information becomes exposed and machine learning algorithms can easily understand it.
from datetime import date
data = pd.DataFrame({'date':
['01-01-2017',
'04-12-2008',
'23-06-1988',
'25-08-1999',
'20-02-1993',
]})
#Transform string to date
data['date'] = pd.to_datetime(data.date, format="%d-%m-%Y")
#Extracting Year
data['year'] = data['date'].dt.year
#Extracting Month
data['month'] = data['date'].dt.month
#Extracting passed years since the date
data['passed_years'] = date.today().year - data['date'].dt.year
#Extracting passed months since the date
data['passed_months'] = (date.today().year - data['date'].dt.year) * 12 + date.today().month - data['date'].dt.month
#Extracting the weekday name of the date
data['day_name'] = data['date'].dt.day_name()
date year month passed_years passed_months day_name
0 2017-01-01 2017 1 2 26 Sunday
1 2008-12-04 2008 12 11 123 Thursday
2 1988-06-23 1988 6 31 369 Thursday
3 1999-08-25 1999 8 20 235 Wednesday
4 1993-02-20 1993 2 26 313 Saturday
Conclusion
👉 Stock Price Prediction – Machine Learning Project in Python
Machine learning has significant applications in stock price prediction. In this machine learning project, we will be talking about predicting the returns on stocks. This is a very complex task with many uncertainties. We will develop this project in two parts:
First, we will learn how to predict stock price using the LSTM neural network.
Then we will build a dashboard using Plotly dash for stock analysis.
https://data-flair.training/blogs/stock-price-prediction-machine-learning-project-in-python/
Stock Price Prediction Project
Datasets
To build the stock price prediction model, we will use the NSE TATA GLOBAL dataset. This is a dataset of Tata Beverages from Tata Global Beverages Limited, National Stock Exchange of India: Tata Global Dataset
To develop the dashboard for stock analysis we will use another stock dataset with multiple stocks like Apple, Microsoft, Facebook: Stocks Dataset
Source Code
Before proceeding ahead, please download the source code: Stock Price Prediction Project
Stock price prediction using LSTM step by step explanation
1. Imports:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
from matplotlib.pylab import rcParams
rcParams['figure.figsize']=20,10
from keras.models import Sequential
from keras.layers import LSTM,Dropout,Dense
from sklearn.preprocessing import MinMaxScaler
2. Read the dataset:
df=pd.read_csv("NSE-TATA.csv")
df.head()
3. Analyze the closing prices from dataframe:
df["Date"]=pd.to_datetime(df.Date,format="%Y-%m-%d")
df.index=df['Date']
plt.figure(figsize=(16,8))
plt.plot(df["Close"],label='Close Price history')
4. Sort the dataset on date time and filter “Date” and “Close” columns:
data=df.sort_index(ascending=True,axis=0)
new_dataset=pd.DataFrame(index=range(0,len(df)),columns=['Date','Close'])
for i in range(0,len(data)):
    new_dataset["Date"][i]=data['Date'][i]
    new_dataset["Close"][i]=data["Close"][i]
5. Normalize the new filtered dataset:
#Set the date as index and drop the column before scaling, so only the Close values are scaled
new_dataset.index=new_dataset.Date
new_dataset.drop("Date",axis=1,inplace=True)
final_dataset=new_dataset.values
train_data=final_dataset[0:987,:]
valid_data=final_dataset[987:,:]
scaler=MinMaxScaler(feature_range=(0,1))
scaled_data=scaler.fit_transform(final_dataset)
x_train_data,y_train_data=[],[]
for i in range(60,len(train_data)):
    x_train_data.append(scaled_data[i-60:i,0])
    y_train_data.append(scaled_data[i,0])
x_train_data,y_train_data=np.array(x_train_data),np.array(y_train_data)
x_train_data=np.reshape(x_train_data,(x_train_data.shape[0],x_train_data.shape[1],1))
6. Build and train the LSTM model:
lstm_model=Sequential()
lstm_model.add(LSTM(units=50,return_sequences=True,input_shape=(x_train_data.shape[1],1)))
lstm_model.add(LSTM(units=50))
lstm_model.add(Dense(1))
inputs_data=new_dataset[len(new_dataset)-len(valid_data)-60:].values
inputs_data=inputs_data.reshape(-1,1)
inputs_data=scaler.transform(inputs_data)
lstm_model.compile(loss='mean_squared_error',optimizer='adam')
lstm_model.fit(x_train_data,y_train_data,epochs=1,batch_size=1,verbose=2)
7. Take a sample of a dataset to make stock price predictions using the LSTM model:
X_test=[]
for i in range(60,inputs_data.shape[0]):
    X_test.append(inputs_data[i-60:i,0])
X_test=np.array(X_test)
X_test=np.reshape(X_test,(X_test.shape[0],X_test.shape[1],1))
predicted_closing_price=lstm_model.predict(X_test)
predicted_closing_price=scaler.inverse_transform(predicted_closing_price)
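As an optional sanity check (a sketch that is not part of the original tutorial), you can compare the predictions with the held-out closing prices using RMSE:
#Validation RMSE between actual and predicted closing prices
valid_actual=valid_data.astype(float)
rmse=np.sqrt(np.mean((valid_actual-predicted_closing_price)**2))
print("Validation RMSE:",rmse)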
8. Save the LSTM model:
lstm_model.save("saved_model.h5")
9. Visualize the predicted stock costs with actual stock costs:
train_data=new_dataset[:987]
valid_data=new_dataset[987:]
valid_data['Predictions']=predicted_closing_price
plt.plot(train_data["Close"])
plt.plot(valid_data[['Close',"Predictions"]])
You can observe that the LSTM model's predictions follow the actual stock prices quite closely.
Build the dashboard using Plotly dash
In this section, we will build a dashboard to analyze stocks. Dash is a Python framework that provides an abstraction over Flask and React.js to build analytical web applications.
Before moving ahead, you need to install dash. Run the below commands in the terminal.
pip3 install dash
pip3 install dash-html-components
pip3 install dash-core-components
Now make a new python file stock_app.py and paste the below script:
import dash
import dash_core_components as dcc
import dash_html_components as html
import pandas as pd
import plotly.graph_objs as go
from dash.dependencies import Input, Output
from keras.models import load_model
from sklearn.preprocessing import MinMaxScaler
import numpy as np
app = dash.Dash()
server = app.server
scaler=MinMaxScaler(feature_range=(0,1))
df_nse = pd.read_csv("./NSE-TATA.csv")
df_nse["Date"]=pd.to_datetime(df_nse.Date,format="%Y-%m-%d")
df_nse.index=df_nse['Date']
data=df_nse.sort_index(ascending=True,axis=0)
new_data=pd.DataFrame(index=range(0,len(df_nse)),columns=['Date','Close'])
for i in range(0,len(data)):
    new_data["Date"][i]=data['Date'][i]
    new_data["Close"][i]=data["Close"][i]
new_data.index=new_data.Date
new_data.drop("Date",axis=1,inplace=True)
dataset=new_data.values
train=dataset[0:987,:]
valid=dataset[987:,:]
scaler=MinMaxScaler(feature_range=(0,1))
scaled_data=scaler.fit_transform(dataset)
x_train,y_train=[],[]
for i in range(60,len(train)):
    x_train.append(scaled_data[i-60:i,0])
    y_train.append(scaled_data[i,0])
x_train,y_train=np.array(x_train),np.array(y_train)
x_train=np.reshape(x_train,(x_train.shape[0],x_train.shape[1],1))
model=load_model("saved_model.h5")
inputs=new_data[len(new_data)-len(valid)-60:].values
inputs=inputs.reshape(-1,1)
inputs=scaler.transform(inputs)
X_test=[]
for i in range(60,inputs.shape[0]):
    X_test.append(inputs[i-60:i,0])
X_test=np.array(X_test)
X_test=np.reshape(X_test,(X_test.shape[0],X_test.shape[1],1))
closing_price=model.predict(X_test)
closing_price=scaler.inverse_transform(closing_price)
train=new_data[:987]
valid=new_data[987:]
valid['Predictions']=closing_price
df= pd.read_csv("./stock_data.csv")
app.layout = html.Div([
html.H1("Stock Price Analysis Dashboard", style={"textAlign": "center"}),
dcc.Tabs(id="tabs", children=[
dcc.Tab(label='NSE-TATAGLOBAL Stock Data',children=[
html.Div([
html.H2("Actual closing price",style={"textAlign": "center"}),
dcc.Graph(
id="Actual Data",
figure={
"data":[
go.Scatter(
x=train.index,
y=valid["Close"],
mode='markers'
)
],
"layout":go.Layout(
title='scatter plot',
xaxis={'title':'Date'},
yaxis={'title':'Closing Rate'}
)
}
),
html.H2("LSTM Predicted closing price",style={"textAlign": "center"}),
dcc.Graph(
id="Predicted Data",
figure={
"data":[
go.Scatter(
x=valid.index,
y=valid["Predictions"],
mode='markers'
)
],
"layout":go.Layout(
title='scatter plot',
xaxis={'title':'Date'},
yaxis={'title':'Closing Rate'}
)
}
)
])
]),
dcc.Tab(label='Facebook Stock Data', children=[
html.Div([
html.H1("Facebook Stocks High vs Lows",
style={'textAlign': 'center'}),
dcc.Dropdown(id='my-dropdown',
options=[{'label': 'Tesla', 'value': 'TSLA'},
{'label': 'Apple','value': 'AAPL'},
{'label': 'Facebook', 'value': 'FB'},
{'label': 'Microsoft','value': 'MSFT'}],
multi=True,value=['FB'],
style={"display": "block", "margin-left": "auto",
"margin-right": "auto", "width": "60%"}),
dcc.Graph(id='highlow'),
html.H1("Facebook Market Volume", style={'textAlign': 'center'}),
dcc.Dropdown(id='my-dropdown2',
options=[{'label': 'Tesla', 'value': 'TSLA'},
{'label': 'Apple','value': 'AAPL'},
{'label': 'Facebook', 'value': 'FB'},
{'label': 'Microsoft','value': 'MSFT'}],
multi=True,value=['FB'],
style={"display": "block", "margin-left": "auto",
"margin-right": "auto", "width": "60%"}),
dcc.Graph(id='volume')
], className="container"),
])
])
])
@app.callback(Output('highlow', 'figure'),
[Input('my-dropdown', 'value')])
def update_graph(selected_dropdown):
    dropdown = {"TSLA": "Tesla","AAPL": "Apple","FB": "Facebook","MSFT": "Microsoft",}
    trace1 = []
    trace2 = []
    for stock in selected_dropdown:
        trace1.append(
            go.Scatter(x=df[df["Stock"] == stock]["Date"],
                       y=df[df["Stock"] == stock]["High"],
                       mode='lines', opacity=0.7,
                       name=f'High {dropdown[stock]}',textposition='bottom center'))
        trace2.append(
            go.Scatter(x=df[df["Stock"] == stock]["Date"],
                       y=df[df["Stock"] == stock]["Low"],
                       mode='lines', opacity=0.6,
                       name=f'Low {dropdown[stock]}',textposition='bottom center'))
    traces = [trace1, trace2]
    data = [val for sublist in traces for val in sublist]
    figure = {'data': data,
              'layout': go.Layout(colorway=["#5E0DAC", '#FF4F00', '#375CB1',
                                            '#FF7400', '#FFF400', '#FF0056'],
                  height=600,
                  title=f"High and Low Prices for {', '.join(str(dropdown[i]) for i in selected_dropdown)} Over Time",
                  xaxis={"title":"Date",
                         'rangeselector': {'buttons': list([{'count': 1, 'label': '1M',
                                                              'step': 'month',
                                                              'stepmode': 'backward'},
                                                             {'count': 6, 'label': '6M',
                                                              'step': 'month',
                                                              'stepmode': 'backward'},
                                                             {'step': 'all'}])},
                         'rangeslider': {'visible': True}, 'type': 'date'},
                  yaxis={"title":"Price (USD)"})}
    return figure
@app.callback(Output('volume', 'figure'),
[Input('my-dropdown2', 'value')])
def update_graph(selected_dropdown_value):
    dropdown = {"TSLA": "Tesla","AAPL": "Apple","FB": "Facebook","MSFT": "Microsoft",}
    trace1 = []
    for stock in selected_dropdown_value:
        trace1.append(
            go.Scatter(x=df[df["Stock"] == stock]["Date"],
                       y=df[df["Stock"] == stock]["Volume"],
                       mode='lines', opacity=0.7,
                       name=f'Volume {dropdown[stock]}', textposition='bottom center'))
    traces = [trace1]
    data = [val for sublist in traces for val in sublist]
    figure = {'data': data,
              'layout': go.Layout(colorway=["#5E0DAC", '#FF4F00', '#375CB1',
                                            '#FF7400', '#FFF400', '#FF0056'],
                  height=600,
                  title=f"Market Volume for {', '.join(str(dropdown[i]) for i in selected_dropdown_value)} Over Time",
                  xaxis={"title":"Date",
                         'rangeselector': {'buttons': list([{'count': 1, 'label': '1M',
                                                              'step': 'month',
                                                              'stepmode': 'backward'},
                                                             {'count': 6, 'label': '6M',
                                                              'step': 'month',
                                                              'stepmode': 'backward'},
                                                             {'step': 'all'}])},
                         'rangeslider': {'visible': True}, 'type': 'date'},
                  yaxis={"title":"Transactions Volume"})}
    return figure
if __name__=='__main__':
    app.run_server(debug=True)
Now run this file and open the app in the browser:
python3 stock_app.py
Summary
Stock price prediction is a good machine learning project for beginners. In this tutorial we learned how to develop a stock price prediction model and how to build an interactive dashboard for stock analysis. We implemented the prediction using an LSTM model and used the Plotly Dash Python framework to build the dashboard.
==============================================================================================================================================================
$pip3 install pillow #image library
$pip3 install psycopg2 #connector to db
Include this in settings.py
#app definition
INSTALLED_APPS = [
    'travel.apps.TravelConfig',
]
Adding image in models.py
class Destination(models.Model):
img=models.ImageField(upload_to='pics')
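For the uploaded images to actually be stored and served, Django also needs media settings and (in development) a media URL route; a minimal sketch with illustrative paths:
#settings.py (add "import os" at the top if it is not there)
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
#urls.py (serving uploads during development only)
from django.conf import settings
from django.conf.urls.static import static
urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)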
@@@@@@@@@
Making migrations for all apps
$ python3 manage.py makemigrations
$ python3 manage.py migrate
Making migrations for a specific app / migrating to a specific migration
$ python3 manage.py makemigrations app_name
$ python3 manage.py migrate app_name 0001_initial
@@@@@@@@@
DJANGO && SERVICE WORKER
from django.conf.urls import url
from django.views.generic import TemplateView
urlpatterns = [
    url(r'^sw.js', TemplateView.as_view(template_name="sw.js", content_type='application/javascript'), name='sw.js'),
]
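On newer Django versions, where url() has been deprecated and removed, the same route can be declared with re_path; a hedged equivalent sketch:
from django.urls import re_path
from django.views.generic import TemplateView
urlpatterns = [
    re_path(r'^sw.js$', TemplateView.as_view(template_name="sw.js", content_type='application/javascript'), name='sw.js'),
]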
👉 DJANGO && MONGODB CONNECTION
1)django_mongodb_engine
2)djongo -> a transpiler: you write SQL queries and it transpiles them to Mongo NoSQL queries
1)django_mongodb_engine ->check https://django-mongodb-engine.readthedocs.io/en/latest/topics/setup.html
Django-nonrel, a fork of Django that adds support for non-relational databases
Djangotoolbox, a bunch of utilities for non-relational Django applications and backends
Django-nonrel
pip install git+https://github.com/django-nonrel/[email protected]
Djangotoolbox
pip install git+https://github.com/django-nonrel/djangotoolbox
Django MongoDB Engine
You should use the latest Git revision.
pip install git+https://github.com/django-nonrel/mongodb-engine
Configuration
Database setup is easy (see also the Django database setup docs):
DATABASES = {
'default' : {
'ENGINE' : 'django_mongodb_engine',
'NAME' : 'my_database'
}
}
Django MongoDB Engine also takes into account the HOST, PORT, USER, PASSWORD and OPTIONS settings.
Possible values of OPTIONS are described in the settings reference.
2)djongo ->check https://pypi.org/project/djongo/
Install djongo:
pip install djongo
Into settings.py file of your project, add:
DATABASES = {
'default': {
'ENGINE': 'djongo',
'NAME': 'your-db-name',
}
}
Run (ONLY the first time to create collections in mongoDB):
manage.py makemigrations
manage.py migrate
YOU ARE SET! HAVE FUN!
Requirements
Djongo requires python 3.6 or above.
How it works
Djongo is a SQL to mongodb query transpiler. It translates a SQL query string into a mongoDB query document. As a result, all Django features, models etc work as is.
Django contrib modules:
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.sessions',
and others are fully supported.
Important links
Full Documentation
Source code
👉 DJANGO && EMAIL CONNECTION
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'portfolio',
'USER':'dan',
'PASSWORD':'phoenix',
'HOST':'127.0.0.1',
'PORT':'5432',
}
}
SMTP IN DJANGO
SETTINGS.PY
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "smtp.gmail.com"
EMAIL_USE_TLS = True
EMAIL_PORT = 587
EMAIL_HOST_USER = "[email protected]"
EMAIL_HOST_PASSWORD = "your account's password"
((( OR )))
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "smtp.gmail.com"
EMAIL_USE_SSL = True
EMAIL_PORT = 465
EMAIL_HOST_USER = "[email protected]"
EMAIL_HOST_PASSWORD = "your account's password"
VIEWS.PY
import django
from django.core.mail import send_mail
from django.shortcuts import redirect
from django.conf import settings
def email(request):
    subject = 'Thank you for registering to our site'
    message = ' It is great choosing to work with a team of perfectionists'
    email_from = settings.EMAIL_HOST_USER
    recipient_list = ['[email protected]',]
    send_mail( subject, message, email_from, recipient_list )
    return redirect('redirect to a new page')
COMMON DB (PLATFORM DEPENDENT)
Relational db: PostgreSQL
Document db: CouchDB
Wide-column distributed db: Cassandra
MULTI-PURPOSE (PLATFORM AGNOSTIC)
General purpose db: SurrealDB
👉 PGADMIN4 INSTALLATION PREVIOUS CRITERIA
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
sudo apt-get install virtualenv python3-pip libpq-dev python3-dev
cd
Create env: virtualenv -p python3 pgadmin4
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
cd pgadmin4
source bin/activate
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
pip3 install https://ftp.postgresql.org/pub/pgadmin/pgadmin4/v4.2/pip/pgadmin4-4.2-py2.py3-none-any.whl
Configure
Override default paths and set it to single-user mode in the local configuration file:
For Python2.x
nano lib/python2.7/site-packages/pgadmin4/config_local.py
For Python3.x
nano lib/python3.6/site-packages/pgadmin4/config_local.py
Write:
import os
DATA_DIR = os.path.realpath(os.path.expanduser(u'~/.pgadmin/'))
LOG_FILE = os.path.join(DATA_DIR, 'pgadmin4.log')
SQLITE_PATH = os.path.join(DATA_DIR, 'pgadmin4.db')
SESSION_DB_PATH = os.path.join(DATA_DIR, 'sessions')
STORAGE_DIR = os.path.join(DATA_DIR, 'storage')
SERVER_MODE = False
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
python3 lib/python3.6/site-packages/pgadmin4/pgAdmin4.py
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
(Browser URL + PORT)
name DB1
connection (host name/address) 127.0.0.1/localhost
username tecmint
passwd securep@wd
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
deactivate
-----------------------------------------------------------------
UBUNTU
sudo apt install postgresql postgresql-contrib
apt-get install postgresql -y
-----------------------------------------------------------------
FEDORA
sudo dnf install postgresql postgresql-contrib
-----------------------------------------------------------------
sudo systemctl start postgresql
sudo systemctl enable postgresql
systemctl status postgresql
ss -antpl | grep 5432
pg_isready
Connect to postgres and creating db: sudo -i -u postgres
psql
CREATE DATABASE myproject;
CREATE USER myprojectuser WITH PASSWORD 'password';
ALTER ROLE myprojectuser SET client_encoding TO 'utf8';
ALTER ROLE myprojectuser SET default_transaction_isolation TO 'read committed';
ALTER ROLE myprojectuser SET timezone TO 'UTC';
GRANT ALL PRIVILEGES ON DATABASE myproject TO myprojectuser;
\q
exit
nano /etc/postgresql/12/main/pg_hba.conf
systemctl restart postgresql
👉 PGADMIN4 INSTALLATION NEW CRITERIA
# Install the public key for the repository (if not done previously):
sudo curl https://www.pgadmin.org/static/packages_pgadmin_org.pub | sudo apt-key add
# Create the repository configuration file:
sudo sh -c 'echo "deb https://ftp.postgresql.org/pub/pgadmin/pgadmin4/apt/$(lsb_release -cs) pgadmin4 main" > /etc/apt/sources.list.d/pgadmin4.list && apt update'
#
# Install pgAdmin
#
# Install for both desktop and web modes:
sudo apt install pgadmin4
# Install for desktop mode only:
sudo apt install pgadmin4-desktop
# Install for web mode only:
sudo apt install pgadmin4-web
# Configure the webserver, if you installed pgadmin4-web:
sudo /usr/pgadmin4/bin/setup-web.sh
http://127.0.0.1/pgadmin4
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
(Browser URL + PORT)
host name DB1
connection (host name/address) 127.0.0.1/localhost
maintenance database postgres
username tecmint
passwd securep@wd
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
deactivate
--------------------------------------------------
Get current info: \conninfo
Postgresql commands
List all databases: \l.
Exit out of help menu: \q
Connect to database: \c database_name
List tables in current database: \dt
List columns in a table: \d table_name
See a list of all psql commands: \? (Press the down arrow to scroll through, or q to exit list.)
exit
########################################################
POSTGRES LOGINS
1. Postgres Web Credentials / Application Credentials**Prefer**
2. Postgres sudo - sudo password
3. Postgres user - Enter a new password as postgres
FORGOTTEN PASSWD/ SET NEW PASSWD ON POSTGRES
sudo -i -u postgres
could not change directory to "/root"
psql (9.1.11)
Type "help" for help.
postgres=# \password
Enter new password:
Enter it again:
postgres=# \q
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
WORKBENCH INSTALLATION
sudo wget https://www.sql-workbench.eu/Workbench-Build125-with-optional-libs.zip
unzip Workbench-Build125-with-optional-libs.zip -d Desktop/Workbench
->install java
wget -P Desktop/PostgreJDBC https://jdbc.postgresql.org/download/postgresql-42.2.10.jar
cd Desktop/Workbench
sudo sh sqlworkbench.sh
jdbc:postgresql://your.database.server.ip:port.number/name_of_database
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
👉 WINDOWS FIX FOR POSTGRES CONFIGURATION
By default the PostgreSQL server will listen only on a local loop-back interface 127.0.0.1. If you need to configure your PostgreSQL server to listen on all networks you will need to configure its main configuration file /etc/postgresql/10/main/postgresql.conf:
$ sudo nano /etc/postgresql/10/main/postgresql.conf
and add the following line somewhere to the CONNECTIONS AND AUTHENTICATION section:
listen_addresses = '*'
sudo service postgresql restart
systemctl status couchdb.service
Run on browser URL:
-> http://localhost:5984/_utils/index.html#login
Username: admin
Password: "..."
-> http://localhost:5984/_utils/index.html#verifyinstall
sudo apt install clang
sudo bash -c "$(wget -O - https://apt.llvm.org/llvm.sh)"
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% ------------------LOGIN LINUX & WINDOWS--------------------
PASSWORD - devopsengineerDan PIN - Dan@1997
ROOT - devopsengineer
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
👉 UEFI SETTINGS
Recovery -> Advanced Startup -> Advanced Options -> Uefi Settings
👉 WINDOWS OPERATING SYSTEM
WINDOWS 10 => OLD HARDWARE
WINDOWS LATEST RELEASE => NEWER HARDWARE
👉 WINDOWS 11 ACTIVATION
Windows 11 Home: TX9XD-98N7V-6WMQ6-BX7FG-H8Q99
Windows 11 Home N: 3KHY7-WNT83-DGQKR-F7HPR-844BM
Windows 11 Pro: W269N-WFGWX-YVC9B-4J6C9-T83GX
Windows 11 Pro N: MH37W-N47XK-V7XM9-C7227-GCQG9
Windows 11 Education: NW6C2-QMPVW-D7KKK-3GKT6-VCFB2
Windows 11 Education N: 2WH4N-8QGBV-H22JP-CT43Q-MDWWJ
For example, if you want to activate Windows 11 Pro, use the command
Step 1: slmgr /ipk W269N-WFGWX-YVC9B-4J6C9-T83GX
Step 2: slmgr /skms kms.msguides.com OR slmgr /skms kms8.msguides.com
Step 3: slmgr /ato
----------------------------------------------------------------------------------------------
👉 LOCAL ACCOUNT LOG IN TO WINDOWS 11
In windows 11 pro
-Sign in options
-Domain join instead
-Enter Local user name and password
In windows 11 home
-Press shift+F10 or, on other computers, shift+Fn+F10
-In the cmd type:
netsh interface show interface
netsh interface set interface <interface-name> disable
exit
-Press back icon
-Enter Local user name and password
->To reenable the network Go to:
Network & Internet
Advanced network settings
Network adapters
Enable the network adapter you disabled ethernet/wifi
Then:
-Sign in options
-Domain join instead
-Enter Local user name and password
👉 ANDROID MOBILE PHONE
Samsung Sound settings -> Separate Apps feature ( Separate Audio to a specific app And Bluetooth Audio to a specific app)
Samsung Music Settings-> Crossfade
Onstream/ Netflix => Movies Streaming
SPlayer => Movies/Video Player
SnapTube => Music Downloader
Lark Player Studio/ Samsung Music => Music Player
TetherFi=> Wifi hotspot
👉 WINDOWS SOFTWARES
Download cpu-z -> ram,cpu,graphics type and storage size
Download Intel Graphics Command Center and Modify Settings to High/Ultra
--------------------------------------------------------------------------
🧨🌀 SYSTEM HARDENING
=>Glass Wire Network Security => Firewall
=>Use latest Wifi/ Internet Security Standard e.g WPA2
=>PortMaster -> Block advertisement, deception, tracker => VPN
=>Windows Defender
=>Avast Offline
=>WPD -> Remove logs and DISABLE WINDOWS UPDATE
👉 System Protection ON
👉 BITLOCKER
https://account.microsoft.com/devices/recoverykey
Microsoft Acc: [email protected]
KEY ID-> 8FFABF93
RECOVERY KEY -> 330803-554235-096261-513920-197538-644919-496243-011418
Microsoft Acc: [email protected]
KEY ID->
RECOVERY KEY ->
--------------------------------------------------------------------------
Screen recorder pro -> Screenshot
RecForth -> Video Recorder
Remote Desktop
Gihosoft Tubeget -> 4k,8k download
IDM
4k Downloader
--------------------------------------------------------------------------
Chrome
Firefox => SaveFrom.net helper extension, DuckDuckGo Privacy Essentials extension, Netcraft Extension
Brave
Microsoft Night Light OR Flux
Windows Powershell ISE -> Advance
VirtualBox
Windows Powershell
Anaconda
Pycharm Community
*Rustrover
*Visual Studio Installer
Cmake
Conan
Eclipse -> c/c++ development, embedded c/c++, scientific c/c++ -> mingw-get-setup.exe + Go Advanced System Settings then Go to Environment variables then Go to System variables, Select Path then Select edit, Add C:\MinGW\bin, Save then OK and Restart computer
Whatsapp
Teams=> Windows 10
Telegram
Skype
Bluestacks
One Note for Windows 10
Netflix
MPC-HC (Media Player Classic Home Cinema)
👉 OPENSOURCE 👉 PROPRIETARY
logseq/appflowy/anytype obsidian/Notion -> Note taking
mattermost slack/microsoftteams
focalboard trello
libreoffice adobe
vscodium vscode
nextcloud googleworksheet/ microsoft365
thunderbird gmail/outlook
👉 MICROSOFT VISUAL STUDIO INSTALLER (Individual Components)
IntelliCode
Windows Universal C Runtime
Security Issue Analysis
Windows 10 SDK
Windows 11 SDK
Embedded and IOT Tools
Windows Performance Toolkit
USB Device Connectivity
👉 AI TOOLS For Business Optimization
Tinywow Collection of AI Tools => https://tinywow.com/
Free Pik => https://www.freepik.com/free-photos-vectors/letterhead
@@@@@@@@@@@@@ EntryPointAI.com/Masterclass
👉 EDITING TOOLS
PDF, IMAGE, WORD CONVERTER => https://online2pdf.com/
👉 CRACKED SOFTWARES
1. 1337x.to OR 1337x.to/#
2. WINDOWS OFFICE SETUP -> https://getintopc.com/?s=Office+2016&submit=Search
3. GAME -> https://gametrex.com
👉 WATCHING MOVIIES SERIES TV
PUTLOCKER -> https://putlockernew.vc/home
IMDb -> https://www.imdb.com/
STREMIO -> [email protected] -> QWERTY@12345a -> INSTALL STREMIO ON DESKTOP https://torrentio.strem.fun/configure
YOUTUBE CHANNEL FullHDvideos4me -> https://www.youtube.com/@FullHDvideos4me
👉 WATCHING GAMES
Cricify TV
👉 DOWNLOAD MOVIES
goojara.to
👉 NETWORK RESET
Reset network to default to remove previous configurations
👉 STOPPING AUTOUPDATE IN WINDOWS
Go to Settings.
Go to Network and Internet
Go to Properties
Turn on Metered Connection to limit the bandwidth
👉 COMPUTER SECURITY
Glass Wire Network Security
Virus Total
👉 SUSPEND AND HIBERNATE
SUSPEND -> DO NOTHING - To enable operating the laptop when you close the lid
HIBERNATE -> ctrl + Delete to run the machine
👉AVOID OVERHEATING
*RESTART MACHINE TO FREE UP MEMORY, THUS AVOID LAGGING OF PERFORMANCE
*CLOSE MACHINE IF NOT IN USE
👉 CHECK TYPE OF SSD
Go to cmd
wmic diskdrive get caption, firmwarerevision
👉 FIX SSD
chkdsk
👉 CHECK TRIM TECHNOLOGY IMPLEMENTATION
Go to cmd
Check - fsutil behavior query DisableDeleteNotify
Enable - fsutil behavior set DisableDeleteNotify 0
Disable - fsutil behavior set DisableDeleteNotify 1
👉 DISABLE AUTO-UPDATES
Windows + R - services.msc -> Windows update -> Disable
👉 KEYBOARD SHORTCUTS
Fn + Alt + F4 -> Close window and Shut Down or Reboot
Windows + R - wmic -> bios get serialnumber
Windows + R - winver -> To check Windows OS Build
Windows + R - dxdiag
Windows + R- %temp% -> Delete temp files
Windows + R - optionalfeatures.exe -> Windows Subsystem for Linux
👉 FIX FLASH ON WINDOWS TERMINAL
diskpart
list disk
select disk 2
attributes disk clear readonly
clean
exit
👉 WINDOWS (UBUNTU TERMINAL)
cd /mnt/c/Users/Dan/
cd /mnt/c/Users/Dan/Desktop/Projects/
👉 SETUP WINDOWS HELLO AND FINGERPRINT
1. Update Your PC
2. Enable Biometric from Administration
3. Update Your Fingerprint Device Driver
First of all, make sure that you’ve installed any possible update that is available for Windows 10. See if you have the needed patches installed:
1) On your keyboard, press the Windows logo key and I at the same time. Click Update & security.
2) Click the Check for updates button. Install available updates found.
When you finish, move on to the second step.
If you don’t have the administrator permission, you’re not likely to use Windows Hello on your PC. To fix it:
1) On your keyboard, press the Windows logo key and R at the same time, then type gpedit.msc and press Enter.
2) Follow the path: Computer Configuration > Administrative Templates > Windows Components. On the right side of the pane, double-click Allow the use of biometrics.
3) Click Enabled. Then click Apply and OK to save the change.
4) Reboot your computer and then move on to the next step.
If your fingerprint device driver is outdated or faulty, you’ll find the hardware impossible to use. You can update the device driver to fix it.
1) On your keyboard, press the Windows logo key and X at the same time, and click Device Manager.
2) Click the icon for Scan for hardware changes on the top bar.
3) Expand Biometric devices. Right-click the fingerprint sensor device that you have and click Uninstall.
4) You can choose to reboot your computer for Windows to help you locate the driver that it could find for your device or, you can choose to update the driver by yourself.
👉 AUTO-MOBILE
===> Engine and Differential-lock[front wheels, back wheels, all wheels]
===> Torque and Horse-power
===> Size and Shape
👉 DRIVING
AVOID OVERTAKING -> Hill, Barriers, Blind corner
SIDE MIRROR -> Junction, Round-about, Entry/Exit way
GO SLOW -> Pedestrian-crossing + Bumps, Bridge, Parking
👉 Tourbillon WATCH 👉 Japanese Kei off-road truck 4x4 👉 Facebook: mercelsha muqpal 👉 Dell xps 13" 9310 i7 11th gen OR BEST 15" 9530 i9 13th gen[RAZOR,APPLE,DELL XPS,X1 CARBON THINKPAD]
🧨🧨🧨🧨🧨
> AVOID PHALLOIDIN && PARABENS TOXINS
> AVOID ANTI-PERSPIRANT ROLL-ONS
> AVOID TAMPONS, BENZENE ALSO..
🧨🧨🧨🧨🧨
> USE CARBON FILTERS FOR WATER
> GET RID OF FRAGRANCE PRODUCTS
> TAKE ORGANIC FOODS AND DRINKS
> USE ORGANIC PADS
> USE COMFORTABLE BRAS
👉 HARDWARE
1 Original Type-C cable(including Original Fast-charger header)
2 Type-C usb port extension(including ethernet port)
3 usbC-nvme stick
4 Portable mifi
5 Power-bank
==> Portable [COMPANY] 5G mifi(All lines)
==> Samsung 990 Pro ssd(Tails,Kali, Kali Purple, Windows10 Recovery iso)
👉 The following are further optimization settings that can be figured once your BIOS is modded.
- These settings are universally recommended optimizations for your hackintosh:
graph TD
id1[Advance Tab]-->id2[Intel Advanced Menu];
id2-->id3[System Agent Configuration];
id2-->id4[Power & Performance];
id3-->id5[Graphics Configuration];
id4-->id6[CPU - Power Management Control];
id5-->|DVMT Pre-Allocated|64M;
id6-->id7[CPU Lock Configuration];
id7-->|CFG Lock|Disabled;
- I also recommend undervolting your machine regardless of your usage. The following are stable settings for my x1c6 with an i7-8650U, verified by stress testing with Prime95 and Heaven Benchmark; yours may be worse or better, so please do your own testing. In addition, I suggest you repaste your machine with an aftermarket thermal paste for lower temps and a better undervolt.
graph TD
id1[Advance Tab]-->id2[Intel Advanced Menu];
id2-->id3[OverClocking Performance Menu];
id3-->|OverClocking Feature|Enabled;
id3-->Processor;
id3-->GT;
id3-->Uncore;
Processor-->|Voltage Offset|100;
Processor-->|Offset Prefix|id4[-];
GT-->|GT Voltage Offset|id8[80];
GT-->|Offset Prefix|id5[-];
GT-->|GTU Voltage Offset|id8;
GT-->|Offset Prefix|id5;
Uncore-->|Uncore Voltage Offset|80;
Uncore-->|Offset Prefix|id6[-];
- The following settings depend on your own personal preference:
- If you want to optimize CPU performance at the cost of battery:
graph LR
id1[Advance Tab]-->id2[Power & Performance];
id2-->id3[CPU - Power Management Control];
id3-->|Boot Performance Mode|id4[Turbo Performance];
id3-->|Config TDP Configurations|Up;
- If you want to optimize battery time at the cost of performance:
graph LR
id1[Advance Tab]-->id2[Power & Performance];
id2-->id3[CPU - Power Management Control];
id3-->|Boot Performance Mode|id4[Max Battery];
id3-->|Config TDP Configurations|Down;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
1) Youtube vids
2) 9 Vids + Project management
3)
*Google Drive + ($AI + 2 projects Two Factor Authentication Systems + PysparkAirflow + kivy salesapp + blog backend python API + nairobi weather graphql API + home lights + Neural Network Scratch+ Data analysis cleaning + Data analysis ETL + Data analysis API Disaster alert system +python wrapper for sendy's API $ +remove gmail), UP COMPLETE
4) * SAF LINE BURUBURU*
5)
1st Day -> Django API,Flask API, C, [RUST]
2nd Day -> Python, Rust ->(5Life + Life) & Career
6) Mathbook -> Discrete, Calculus, Statistics, Probability, Linear Algebra, Graphs
7) Project Management
SOFTWARE
1 Python(Logistic & Linear regression,RNN)
2 Python Data Cleaning, RustLeaf & RustTensorflow
3 Django Tenants
4 Graphql Nairobi water*, Blog backend
5 MPESA API integration system
6 Python wrapper for Sendy & AI Monitoring System
7 **Home Lights**
9 Python & C open source engineer
VIDEOS + 2GIT + Printing + (Block Chain & Datastructures Summary Table)
NetworkDesign
FINAL -> (Network design + [(Python&CSockets)+UTP] ) & [Cybersecurity] & (Home Lights + [2AI, Python, Rust, C] )
-> {2Networks+kvmConfigurations, *Subnetting*}Cybersecurity
-> Network Security + Algorithm Design
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
-> Overview: CybersecurityVIDEOS
10hrs Cybersecurity-> https://www.youtube.com/watch?v=JcaYwZ5fqjU
Ethical Hacking in 15 Hours - 2023 Edition - Learn to Hack! (Part 1) -> https://www.youtube.com/watch?v=3FNYvj2U0HM
Ethical Hacking in 15 Hours - 2023 Edition - Learn to Hack! (Part 2) -> https://www.youtube.com/watch?v=sH4JCwjybGs
PwnFunction
[Web Security]
Open Redirect Vulnerability Explained -> https://www.youtube.com/watch?v=4Jk_I-cw4WE&list=PLI_rLWXMqpSl_TqX9bbisW-d7tDqcVvOJ
HTTP Parameter Pollution Explained -> https://www.youtube.com/watch?v=QVZBl8yxVX0&list=PLI_rLWXMqpSl_TqX9bbisW-d7tDqcVvOJ&index=2
Insecure Direct Object Reference (IDOR) Explained -> https://www.youtube.com/watch?v=rloqMGcPMkI&list=PLI_rLWXMqpSl_TqX9bbisW-d7tDqcVvOJ&index=3
Cross-Site Request Forgery (CSRF) Explained -> https://www.youtube.com/watch?v=eWEgUcHPle0&list=PLI_rLWXMqpSl_TqX9bbisW-d7tDqcVvOJ&index=5
Cross-Site Scripting (XSS) Explained -> https://www.youtube.com/watch?v=EoaDgUgS6QA&list=PLI_rLWXMqpSl_TqX9bbisW-d7tDqcVvOJ&index=7
Insecure Deserialization Attack Explained -> https://www.youtube.com/watch?v=jwzeJU_62IQ&list=PLI_rLWXMqpSl_TqX9bbisW-d7tDqcVvOJ&index=9
Don't make random HTTP requests -> https://www.youtube.com/watch?v=RCJdPiogUIk&list=PLI_rLWXMqpSl_TqX9bbisW-d7tDqcVvOJ&index=10
[Binary Exploitation]
What are Executables? | bin 0x00 -> https://www.youtube.com/watch?v=WnqOhgI_8wA&list=PLI_rLWXMqpSkAYfar0HRA7lykydwmRY_2&index=1
How some functions can be Dangerous | bin 0x01 -> https://www.youtube.com/watch?v=EJtUW2AklVs&list=PLI_rLWXMqpSkAYfar0HRA7lykydwmRY_2&index=3
Why you should Close Your Files | bin 0x02 -> https://www.youtube.com/watch?v=6SA6S9Ca5-U&list=PLI_rLWXMqpSkAYfar0HRA7lykydwmRY_2&index=3
Dangerous Code Hidden in Plain Sight for 12 years -> https://www.youtube.com/watch?v=6SA6S9Ca5-U&list=PLI_rLWXMqpSkAYfar0HRA7lykydwmRY_2&index=3
==================================================> Further: Hackersploit (BLUE AND RED TEAM)
==================================================> 1ST FREE TIME Network Security Videos
=> API
Patrick Loeber(Django, FastAPI, Flask)
software-project-based-learning GitHub
--------------
Django https://codeburst.io/create-a-django-api-in-under-20-minutes-2a082a60f6f3
Flask https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-xxiii-application-programming-interfaces-apis
--------------
Rodney Osodo(API, Python Wrapper)
GraphQL
Fireship -> https://www.youtube.com/watch?v=eIQh02xuVw4
NetNinja -> https://www.youtube.com/watch?v=xMCnDesBggM&list=PL4cUxeGkcC9gUxtblNUahcsg0WLxmrK_y
=> DATABASE
1. DATABASE Normalization(1-4) & SQL Tutorial -> https://www.youtube.com/watch?v=y03oYWDLu0Q&list=PLTd6ceoshprdZIu24-_u00-m2Vf9o5aVC
2. SQL Tutorial(Overview) -> https://www.youtube.com/watch?v=kXQccIONcc4&list=PL7yh-TELLS1FVLG6AFoT-3lVcbJiPTWze
=> EDR Artificial Intelligence System
Prevention -> https://github.com/devopsengineerDan/cybersecurity-IPS-IntrusionPreventionSystem
Detection -> https://github.com/devopsengineerDan/cybersecurity-maltrail
https://devopsengineerdan.github.io/ My blog name devopsengineerdan which is dev Repository/alexloannides blog -> ML Pipeline part 1 and 2
https://github.com/devopsengineerDan/pyspark-ETL-project-and-airflow-Data-pipeline
=> Cybersecurity Objective UP TOP
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
@ AI TOOLS For Business Optimization
EntryPointAI.com/Masterclass
Tinywow.com
@ IOT FUNDAMENTALS 2 WEEKS
Espressif Systems => Getting started with ESP-IDF -> https://www.youtube.com/watch?v=J8zc8mMNKtc
👉 ADVANCE USER => Embedotronics Technologies => ESP-IDF tutorials -> https://www.youtube.com/watch?v=fD7B9LI45Rk&list=PLmiBQYp1gExu4n2svyvwGCMv0geIOTw0X
Science Buddies => How to use a breadboard -> https://www.youtube.com/watch?v=6WReFkfrUIk
LeftyMaker => How to use a breadboard -> https://www.youtube.com/watch?v=W6mixXsn-Vc
--------------------------
IOT
===> First Build
Flush + Press Reset Button
Build
Run Program
IOT
===> PUTTY
#!/bin/bash
sudo chmod 666 /dev/ttyUSB0
putty
--------------------------
1ST PROJECT
Low Level Learning - esp32 setup -> https://www.youtube.com/watch?v=dOVjb2wXI84
Low Level Learning - esp32 wireless network connectivity and TCPIP connection -> https://www.youtube.com/watch?v=_dRrarmQiAM
***Low Level Learning - IOT Kettle Heater Project via Wifi and TCPIP -> https://www.youtube.com/watch?v=8-JE3RdyEds
2ND PROJECT
***IOT HOME LIGHT SYSTEM -> https://github.com/devopsengineerDan/automating-home-lights-python
@ API FUNDAMENTALS 2 WEEKS
=> Building a Restful API
Django
Flask
fastAPI
Rodney API
Django Github API => Consuming a Restful API
Flask News API => Consuming a Restful API
SendyAPI Wrapper => Consuming a Restful API + Building an Restful API
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
👉 2 ELECTRONIC PROJECTS + [TV Guard] + {Electric cable connection + Tester Screw driver} && TOOLS
{
Fan Blower + Tool box + Carbon Fiber Electrostatic Discharge Anti-Static ESD Gloves - Polyurethane Palm Coated
Voltage Tester + Solder Tool
🧨 COMPUTER MOTHERBOARD ELECTRONICS GUIDE -> https://m.youtube.com/@electronicsrepairbasics_erb
}
👉 2 CERTS
After getting 1ST CERT = DESTROY PAPER in the Folder.
AFTER getting 2ND CERT = REST.
👉 WIRING *******************************
👉 ASSEMBLE DESKTOP SERVER [INSTALLATION, MAINTENANCE AND CUSTOMIZATION] *******************************
👉 SOLDERING OF NON-CRUCIAL SEMI-CONDUCTORS *******************************
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
👉 sterlingtuitioncentre0.github.io
{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}
[email protected]
CateMolly1992
👉 www.namecheap.com
SterlingManager
MANAGER@987#
👉 TRADING
https://www.henriotgroup.com/
https://www.gammadeltafund.com/
https://www.tradingview.com/symbols/AUDUSD/?utm_campaign=ticker-tape&utm_medium=widget_new&utm_source=www-gammadeltafund-com.filesusr.com
https://t.me/thegammadeltafund
##############################################################################################################################################################################################################################################
===============================================================================================
30M => 10M[2Bedroom @ 10M] * 1[HOUSE] + 20M[2Bedroom @ 8/9M EACH] * 2[HOUSES] ===> SPACIOUS HOUSE + PREMIUM WELL-ESTABLISHED AREA ===> EXTRA HIGH-END
=> 4M/2M BALANCE
=> Week=75K Month=300K 12%=Year=3.6M 9Years=32,400,000 + 300,000 BONUS
MALINDI => [BLUFF NGO OWNERSHIP] ===> CONCRETE-WIRE FENCE + TREE PLANTING + FARMING(1Manager and 1Dweller)
1=> 1M 25 HECTARES => MINE FOREX ----> MALINDI HIGH-END RENTAL RESIDENTIAL APARTMENTS
2=> 1M 25 HECTARES => DAD and Refund After a Year 100K Per Month FROM DAD'S BUSKET ----> MALINDI HIGH-END RENTAL RESIDENTIAL APARTMENTS
MALINDI => [BLUFF NGO OWNERSHIP] ===> CONCRETE-WIRE FENCE + TREE PLANTING + FARMING(1Manager and 2Dwellers)
=> 4M/2M BALANCE => 4M 100 HECTARES => 4M BALANCE
1=> 2M --> 50 HECTARES => HIGH-END HOTEL RESORT + GOLF COURSE
2=> 2M --> 50 HECTARES => 25 HECTARES ----> MALINDI HIGH-END RENTAL RESIDENTIAL APARTMENTS => 25 HECTARES ----> MALINDI HIGH-END RENTAL RESIDENTIAL APARTMENTS
===============================================================================================
PROFOUND WISDOM
30M --> 8M,8M 7M,7M 👉👉👉 ===> 8M[1.Brookside, 2. WAITUpperhill, 3. WAITNyari OR Enaki !!!!!! PLACE NOT KILELESHWA,] 7M[1. Garden City, 2. Junction Mall, 3. Muthaiga]
👉15M --> 8M 7M 👉10M 1M --> ZOLLY FEE 1M --> ZOLLY + MOM 2YRS RENT + 15K 2YRS UPKEEP 1=> 1M 25 HECTARES 2=> 1M 25 HECTARES 3=> 4M 100 HECTARES 1M --> HOUSE UTILITY BACKUP 1M --> FENCE
"""""""""""" 1M --> DAD'S LOAN 50K PER MONTH 3M --> 1 BEDROOM OFFICE --> RENT THE PREMISE 1M --> 500K SCHOOL / 500K SECURITY """"""""""""" 👉👉👉 TOTAL => 5M FOREX COMMITMENT 👉 4M 4 LANDS DOWN HERE..... 👉 1M FENCE
30M => 10M[2Bedroom @ 10M] * 1[HOUSE] ===> SPACIOUS HOUSE + PREMIUM WELL-ESTABLISHED AREA ===> EXTRA HIGH-END => BACKUP @@@ TOP CREAM => 15M [2Bedroom @ 5M EACH] * 3[HOUSES] ===> SPACIOUS HOUSE + PREMIUM WELL-ESTABLISHED AREA ===> HIGH-END => 4M [4 Separate REMOTE LARGE pieces of land @ 1M EACH] * 4[MASSIVE LANDS] ===> CONCRETE-WIRE FENCE + TREE PLANTING + FARMING(1Manager and 4Dwellers)
=> 1M BALANCE OF SECOND DEAL1[SWANAPOLE DAD LAND NAIVASHA @ 1M] * 1[MASSIVE LAND] ===> CONCRETE-WIRE FENCE + TREE PLANTING + FARMING(1Manager and 1Dweller) + CONTINUOUS BASKET FILLING
=> Week=75K Month=300K 12%=Year=3.6M 9Years=32,400,000 + 300,000 BONUS
30M => 10M[2Bedroom @ 10M] * 1[HOUSE] ===> SPACIOUS HOUSE + PREMIUM WELL-ESTABLISHED AREA ===> EXTRA HIGH-END => 33 ENGINEER @@@ TOP CREAM
=> 1M[ZOLLY {COMPLETE SCHOOL FEES}]
=> 1M SUPPORT OF [ZOLLY + BERRETTA {HOUSE RENT 2 YEARS + 2 YEARS LIVELIHOOD}]
*******=> GOAL *******===> (ZOLLY + MOM) UNTIL SCHOOL ENDS, NEED ZOLLY EXPERTISE(ACCOUNTING, TAX, ASSET MANAGEMENT----> International Certification in the 3)
=> 1M[HEALTH COVER + INSURANCE COVER ZOLLY + MOM]
-----> TOTAL 3M
-----> BALANCE 10M-3M=7M / 10M-2M=8M
=> 10M BALANCE OF SECOND DEAL2 -----> => 17M/18M [2 Separate REMOTE VERY LARGE **THE BIGGEST*** pieces of land @ ***RANCH*** 17M/18M] * 1[RANCH LAND] ===> CONCRETE-WIRE FENCE + TREE PLANTING + FARMING(1Manager and 3Dwellers)
=> Week=75K Month=300K 12%=Year=3.6M 9Years=32,400,000 + 300,000 BONUS
30M => 10M[2Bedroom @ 10M] * 1[HOUSE] ===> SPACIOUS HOUSE + PREMIUM WELL-ESTABLISHED AREA ===> EXTRA HIGH-END => SOFTWARE ENGINEER @@@ TOP CREAM => 10M [2 Separate REMOTE VERY LARGE pieces of land @ RANCH 5M EACH] * 2[RANCH LANDS] ===> CONCRETE-WIRE FENCE + TREE PLANTING + FARMING(1Manager and 2Dwellers)
=> 10M BALANCE OF SECOND DEAL3 -----> => 10M [2 Separate REMOTE VERY LARGE pieces of land @ ***RANCH*** 5M EACH] * 2[RANCH LANDS] ===> CONCRETE-WIRE FENCE + TREE PLANTING + FARMING(1Manager and 2Dwellers)
=> Week=75K Month=300K 12%=Year=3.6M 9Years=32,400,000 + 300,000 BONUS
30M => 10M[2Bedroom @ 10M] * 1[HOUSE] ===> SPACIOUS HOUSE + PREMIUM WELL-ESTABLISHED AREA ===> EXTRA HIGH-END => SECURITY ENGINEER @@@ TOP CREAM
=> 20M BALANCE OF SECOND DEAL4
=> Week=75K Month=300K 12%=Year=3.6M 9Years=32,400,000 + 300,000 BONUS
30M => 10M[2Bedroom @ 10M] * 1[HOUSE] ===> SPACIOUS HOUSE + PREMIUM WELL-ESTABLISHED AREA ===> EXTRA HIGH-END => ALPHA AND OMEGA ENGINEER @@@ TOP CREAM
=> 20M BALANCE OF SECOND DEAL5[10M{GROUND-WORK + MISCELLANEOUS} + 10M{RESORT GET-AWAY + CLASSROOMS SECTION}]
=> Week=75K Month=300K 12%=Year=3.6M 9Years=32,400,000 + 300,000 BONUS
===============================================================================================
2M => 2M[1Bedroom @ 2M] * 1[HOUSE] ===> SPACIOUS HOUSE + PREMIUM WELL-ESTABLISHED AREA => EXTRA HIGH-END => Week=5K Month=20K 12%=Year=240K 9Years=2,160,000 + 20,000 BONUS
===============================================================================================
2 HOUSES
1ST HOUSE => CONTINUOUS BASKET FILLING + MARKET SHOCK ABSORBER
2ND HOUSE => DAD 1M LOAN => DAD GOES --> CUT COST OF HOUSE ZOLLY MOM + BERRETTA POULTRY BUSINESS => CONTINUOUS BASKET FILLING + MARKET SHOCK ABSORBER
BACKUP LIFE BASKET => USED1M BALANCE OF SECOND DEAL1 + USED17/18M BALANCE OF SECOND DEAL2 + USED10M BALANCE OF SECOND DEAL3 + 20M BALANCE OF SECOND DEAL4 + USED20M BALANCE OF SECOND DEAL5 => FOREX CONTINUOUS INCOME
RENT BASKET => 120K => 1YEAR RENT=2025 => 120K => 1YEAR=2026 => 120K => 1YEAR RENT[1EXTRA]=2027
LIVELIHOOD BASKET => 120K + 120K MEDICAL COVER => 120K DAD LOAN => HOME UTILITY SECURITY -------------------------> 50k per month 1yr 8mths DAD => INVESTMENT => LOAN PAY PER MONTH => RELATIONSHIP => PER MONTH => MEDICAL COVER => PER MONTH => MISCELLANEOUS => PER MONTH
WIVES => PER MONTH --------------------------------------> 10k per month 2yrs 6 mths => INVESTMENT => PER MONTH => RELATIONSHIP => PER MONTH => MEDICAL COVER => PER MONTH => MISCELLANEOUS => PER MONTH
FAMILY => INVESTMENT => PER MONTH => RELATIONSHIP => PER MONTH => MEDICAL COVER => PER MONTH => MISCELLANEOUS => PER MONTH
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
=> CREATE ON 2025 & DELETE ON 2035 LIGHT TECHNOLOGIES LTD ACCOUNT/ REALTORS ACCOUNT ---> DTB ACC [TRANSFER RANGE BETWEEN 700K - 800K] => CREATE ON 2025 & DELETE ON 2035 HENRIOT REALTORS ACCOUNT ---> DTB ACC [TRANSFER RANGE BETWEEN 700K - 800K] %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
=> DAWN(LION GROUP MANAGEMENT LTD) ACCOUNT ----> (SAVINGS, TAX AVOIDANCE NOT EVASION, BANK MONEY MARKET FUND----> International Certification in the 3) => EAGLE MANAGEMENT LTD ACCOUNT REGISTERED in Seychelles ----> 🧨HAVE ALL REQUIRED LICENSES IN SEYCHELLES && STATIONED COUNTRY, Now Kenya🧨 ----> Receive revenue from JOHNSON REALTORS ACCOUNT ----> Transfer Received Revenue to DAWN ACCOUNT => DTB SAVINGS ACCOUNT ----> (SAVINGS----> International Certification in the 1)
=> DAWN DANIEL ACCOUNT ----> (ENGINEERING[SOFTWARES && LABS(TELECOMMUNICATION/CLOUD SOFTWARE+HARDWARE/CYBERSECURITY)]----> International Certification in the 8)
DAWN DANIEL NGO ACCOUNT
=> DAWN MURUNGI ACCOUNT ----> (ENGINEERING[BUILDING CONSTRUCTION, ROADS]----> International Certification in the 2)
DAWN MURUNGI NGO ACCOUNT
=> DAWN SWANAPOLE ACCOUNT ----> (IT, FLOWERS, APPLE, AVOCADO, WATER----> International Certification in the 5)
DAWN SWANAPOLE NGO ACCOUNT
=> DAWN JOHNSON MANAGEMENT ACCOUNT ----> (ACCOUNTING, TAX, ASSET MANAGEMENT----> International Certification in the 3)
----> 🧨HAVE LICENSES IN ACCOUNTING, TAX, ASSET MANAGEMENT, HR, REALTORS🧨
---->🧨🧨🧨HAVE LICENSE IN HENRIOT REALTORS LTD🧨 => THEN STOP BUSINESS IN 2034🧨🧨🧨
CREATE ON 2035 JOHNSON REALTORS ACCOUNT ----> 🧨HAVE LICENSE IN REALTORS🧨
CREATE ON 2035 DAWN JOHNSON NGO ACCOUNT ----> 🧨HAVE LICENSE IN NGO🧨
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
BIG LANDS MALINDI ----> [HOTEL/RESTAURANT && GOLF COURSE] 2 RANCHES ----> [HOTEL/RESTAURANT && GOLF COURSE] + SCHOOL&&CHURCH 2 RANCHES ----> [DRAGON FRUIT, AVOCADO && APPLE FARM] + [GOATS, SHEEP, RABBITS, GOOSE, DUCKS, POULTRY FARM] 1 SOLID BIGGEST RANCH ----> [TREES && HAY SELLING FARM]
2025 => B => FINANCIAL ENGINEERING [DOC + SCRIPTS] [100K + 300K] -> If it goes as planned save CASH 2027 => M => BUSINESS ADMINISTRATION [DOC + SCRIPTS] [100K + 300K] -> If it goes as planned give out [300K + 300K = 600K CASH NEXT YEAR]
2028 => GIVE 600K CASH TO SYSTEM JAMES + VACATE RUAI HOUSE
===> HAVE 1. BROKERAGE Institution + 2. SECURITIES Institution + 3. LIQUIDITY Institution ===> START FAMILY
2025 + 9 = 2034 2034 DEC => TERMINATE CONTRACT with my 2 CLIENTS ------> MAINTAIN CONNECTION WITH 2 CLIENTS => FREE MAN From my 2 CLIENTS => WITHDRAW From HENRIOT GROUP GAMMA DELTA + WITHDRAW INVESTED MONEY + INTEREST ------> MAINTAIN CONNECTION WITH HENRIOT GROUP GAMMA DELTA PROFESSIONALS
***=> DELETE LIGHT TECHNOLOGIES ACCOUNT
***=> DELETE [REALTORS + NGO] ACCOUNT
***=> MAINTAIN DTB ACCOUNT
2035 =====> LET EVERYTHING COOL
2036 =====> RETRIEVE OWNERSHIP FROM ZOLLY
2036 ===> HAVE 1. BROKERAGE Institution + 2. SECURITIES Institution + 3. LIQUIDITY Institution ===> MAINTAIN FAMILY ===> ENTER GOVERNMENT KANU
🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨
=====> BANK
=====> CHURCH
-------
=====> EMPLOYEE SAVINGS
=====> NGO
🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨🧨