├── Chaper1
├── .gitignore
├── .github
│   ├── CODEOWNERS
│   ├── PULL_REQUEST_TEMPLATE.md
│   ├── workflows
│   │   └── main.yml
│   └── ISSUE_TEMPLATE.md
├── CONTRIBUTING.md
├── NOTICE
├── Chapter2
│   ├── README.md
│   └── 02_05
├── README.md
├── Chapter4
│   └── 04_05.md
├── Chapter3
│   └── 03_05.md
└── LICENSE

--------------------------------------------------------------------------------
/Chaper1:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
.DS_Store
node_modules
.tmp
npm-debug.log
--------------------------------------------------------------------------------
/.github/CODEOWNERS:
--------------------------------------------------------------------------------
# Codeowners for these exercise files:
# * (asterisk) denotes "all files and folders"
# Example: * @producer @instructor
--------------------------------------------------------------------------------
/.github/PULL_REQUEST_TEMPLATE.md:
--------------------------------------------------------------------------------

--------------------------------------------------------------------------------
/.github/workflows/main.yml:
--------------------------------------------------------------------------------
name: Copy To Branches
on:
  workflow_dispatch:
jobs:
  copy-to-branches:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - name: Copy To Branches Action
        uses: planetoftheweb/copy-to-branches@v1.2
        env:
          key: main
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
Contribution Agreement
======================

This repository does not accept pull requests (PRs). All pull requests will be closed.
However, if any contributions (through pull requests, issues, feedback or otherwise) are provided, as a contributor, you represent that the code you submit is your original work or that of your employer (in which case you represent you have the right to bind your employer). By submitting code (or otherwise providing feedback), you (and, if applicable, your employer) are licensing the submitted code (and/or feedback) to LinkedIn and the open source community subject to the BSD 2-Clause license.
--------------------------------------------------------------------------------
/NOTICE:
--------------------------------------------------------------------------------
Copyright 2025 LinkedIn Corporation
All Rights Reserved.

Licensed under the LinkedIn Learning Exercise File License (the "License").
See LICENSE in the project root for license information.

Please note, this project may automatically load third party code from external
repositories (for example, NPM modules, Composer packages, or other dependencies).
If so, such third party code may be subject to other license terms than as set
forth above. In addition, such third party code may also depend on and load
multiple tiers of dependencies. Please review the applicable licenses of the
additional dependencies.
--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE.md:
--------------------------------------------------------------------------------
## Issue Overview


## Describe your environment


## Steps to Reproduce

1.
2.
3.
4.
## Expected Behavior


## Current Behavior


## Possible Solution


## Screenshots / Video


## Related Issues

--------------------------------------------------------------------------------
/Chapter2/README.md:
--------------------------------------------------------------------------------
Self-Healing: Block the IPs which are flagged in the NRF log.

# Make sure `df` already exists and anomaly detection has been run

# Step 1: Regenerate the original category mapping
# (assumes df['src_ip'] still holds the original IP strings)
df['src_ip'] = df['src_ip'].astype('category')
src_ip_mapping = dict(enumerate(df['src_ip'].cat.categories))
df['src_ip_code'] = df['src_ip'].cat.codes

# Step 2: Find repeat offenders
anomalies = df[df['anomaly'] == -1]
offenders = anomalies['src_ip_code'].value_counts()
THRESHOLD = 3  # Adjust as needed

trigger_ips = offenders[offenders > THRESHOLD].index.tolist()

# Step 3: Simulate the self-healing response
print("🚨 Triggered Self-Healing Response:")
if not trigger_ips:
    print("✅ No offending IPs exceeded threshold.")
else:
    for ip_code in trigger_ips:
        ip_str = src_ip_mapping.get(ip_code, f"[Unknown IP code {ip_code}]")
        print(f"🚫 Blocking IP: {ip_str} (Simulated IPTables or ACL)")

✅ Example output you should see:

🚨 Triggered Self-Healing Response:
🚫 Blocking IP: 172.31.99.99 (Simulated IPTables or ACL)
🚫 Blocking IP: 10.100.200.1 (Simulated IPTables or ACL)
🚫 Blocking IP: 198.51.100.42 (Simulated IPTables or ACL)
🚫 Blocking IP: 10.246.3.107 (Simulated IPTables or ACL)
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# AI for Telecom: Network Optimization & Security in 5G/Edge Systems
This is the repository for the LinkedIn Learning course `AI for Telecom: Network
Optimization & Security in 5G/Edge Systems`. The full course is available from [LinkedIn Learning][lil-course-url].

![lil-thumbnail-url]

## Course Description

Unlock the power of AI in telecom with this hands-on course designed for both technical and non-technical audiences. Learn how AI optimizes networks, enhances security, and drives automation in 5G, ORAN, IoT, and edge computing environments. Gain practical skills to design and deploy AI-driven solutions using tools like Python, TensorFlow, and MATLAB. Discover how to evaluate AI’s impact on business performance, scalability, and compliance, empowering you to turn insights into action and future-proof telecom operations.

## Instructor

Taha Sajid

Founder of XecurityPulse | Principal Architect | Author | EB1A Coach

Check out my other courses on [LinkedIn Learning](https://www.linkedin.com/learning/instructors/taha-sajid?u=104).

[0]: # (Replace these placeholder URLs with actual course URLs)

[lil-course-url]: https://www.linkedin.com/learning/ai-for-telecom-network-optimization-and-security-in-5g-edge-systems
[lil-thumbnail-url]: https://media.licdn.com/dms/image/v2/D4E0DAQHQTPKa1YPriA/learning-public-crop_675_1200/B4EZc3kca6HcAg-/0/1748983990595?e=2147483647&v=beta&t=OO6KmTNsSyFxh1eyeKqkH54me8pd2Zhg-yTqxUbFQMo
--------------------------------------------------------------------------------
/Chapter2/02_05:
--------------------------------------------------------------------------------
Objective: To build an AI-based system using Isolation Forest that can analyze Free5GC NRF logs and automatically flag suspicious or abnormal traffic behavior using pattern recognition.

Step 1: Upload your log file, for example NRF_LOG
-----------------------------------------------

from google.colab import files
uploaded = files.upload()


Step 2: Parse Log Lines into a
Structured Table
----------------------------------------------------

import re
import pandas as pd

logfile = "nrf_logs.txt"  # ✅ Your actual file name

log_data = []

with open(logfile, 'r') as f:
    for line in f:
        # ⚠️ Sample pattern — adjust based on your actual log format
        # Looking for a format like: [time] src=... dst=... method=... uri=...
        match = re.search(r'\[(.*?)\].*?src=(.*?) dst=(.*?) method=(\w+) uri=(\S+)', line)
        if match:
            timestamp = match.group(1)
            src_ip = match.group(2)
            dst_ip = match.group(3)
            method = match.group(4)
            uri = match.group(5)
            log_data.append([timestamp, src_ip, dst_ip, method, uri])

df = pd.DataFrame(log_data, columns=['timestamp', 'src_ip', 'dst_ip', 'method', 'uri'])
df.head()

Step 3: Encode Categorical Fields for Machine Learning
-----------------------------------------------------------------

# Convert categorical fields to numeric codes
df['src_ip'] = df['src_ip'].astype('category').cat.codes
df['method'] = df['method'].astype('category').cat.codes
df['uri'] = df['uri'].astype('category').cat.codes
# (No status field is captured by the parsing pattern above, so we skip it here.)

# Extract timestamp features (like seconds since the first request)
df['timestamp'] = pd.to_datetime(df['timestamp'])
df['relative_time'] = (df['timestamp'] - df['timestamp'].min()).dt.total_seconds()

# Select features for ML
features = df[['src_ip', 'method', 'uri', 'relative_time']]
features.head()


Step 4: Cluster Traffic Behavior with KMeans
-------------------------------------------------
What: Use KMeans(n_clusters=3) to group requests into broad traffic behavior types and visualize them over time.
Why: Clustering gives a quick visual sense of "normal" traffic patterns before hunting for outliers.
from sklearn.cluster import KMeans
import matplotlib.pyplot as plt

# Cluster into 3 traffic behavior types
kmeans = KMeans(n_clusters=3, random_state=0)
df['cluster'] = kmeans.fit_predict(features)

# Plot the clusters
plt.figure(figsize=(10, 6))
plt.scatter(df['relative_time'], df['src_ip'], c=df['cluster'], cmap='viridis', s=50)
plt.xlabel('Time Since Start (s)')
plt.ylabel('Source IP (coded)')
plt.title('NRF Log Traffic Clustered by Behavior')
plt.grid(True)
plt.show()

Step 5: Run Isolation Forest and Interpret the Anomalies
-----------------------------------
What: Use IsolationForest(contamination=0.05) to flag rare or unusual request patterns.
Why: Isolation Forest is unsupervised and great at detecting outliers, especially in high-volume, unlabeled datasets.

from sklearn.ensemble import IsolationForest

# IsolationForest to find outliers in traffic behavior
model = IsolationForest(contamination=0.05, random_state=0)
df['anomaly'] = model.fit_predict(features)

# Show anomalous requests
anomalies = df[df['anomaly'] == -1]
print("⚠️ Detected Anomalies:")
print(anomalies[['timestamp', 'src_ip', 'method', 'uri']])
--------------------------------------------------------------------------------
/Chapter4/04_05.md:
--------------------------------------------------------------------------------
Objective: Build a Telecom Root Cause Analysis (RCA) Assistant using LLM + RAG

Step 1: Install Required Packages
------------------------------------

!pip install -U langchain langchain-community chromadb sentence-transformers openai gradio


Step 2: Set Your Groq API Key
------------------

import os
from openai import OpenAI

os.environ["GROQ_API_KEY"] = "your-groq-api-key"  # Replace with your real key

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"]
)


Step 3: Upload and Parse NRF Logs
--------------------------------------

from google.colab import files
from langchain.docstore.document import Document
import re

uploaded = files.upload()
log_file = next(iter(uploaded))

documents = []
with open(log_file, "r") as f:
    for line in f:
        match = re.match(r"(\d{4}-\d{2}-\d{2}T[^\s]+)\s\[INFO\]\[NRF\]\[GIN\]\s\|\s(\d{3})\s\|\s([\d\.]+)\s\|\s([A-Z]+)\s\|\s([^|]+)", line)
        if match:
            documents.append(Document(
                page_content=line.strip(),
                metadata={
                    "timestamp": match.group(1),
                    "status_code": match.group(2),
                    "ip": match.group(3),
                    "method": match.group(4),
                    "endpoint": match.group(5)
                }
            ))

Step 4: Set Up the Chroma Vector Store
---------------------------------------

from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import Chroma

embedding_model = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
db = Chroma.from_documents(documents, embedding_model)
retriever = db.as_retriever()


Step 5: Custom LangChain-Compatible Prompt Builder for the Groq Client
-------------------------------------------------------------

def get_contextual_rag_prompt(query: str, context_docs: list) -> str:
    context = "\n".join([doc.page_content for doc in context_docs])
    return f"""
You are a 5G security analyst analyzing NRF logs.

CONTEXT:
{context}

QUESTION:
{query}

Provide a detailed RCA with actionable insights.
""".strip()

Step 6: Retrieve Context and Generate the RCA Using Groq + LLaMA 3
--------------------------------------------------

# Sample RCA query
query = "Explain why IP 10.100.200.1 caused an NRF fault"

# Retrieve top documents
relevant_docs = retriever.get_relevant_documents(query)

# Build the final prompt
final_prompt = get_contextual_rag_prompt(query, relevant_docs)

# Call the Groq LLaMA 3 model
response = client.chat.completions.create(
    model="llama3-8b-8192",
    messages=[
        {"role": "system", "content": "You are an expert in 5G NRF log analysis."},
        {"role": "user", "content": final_prompt}
    ]
)

print("📍 Root Cause Analysis:\n")
print(response.choices[0].message.content)


Sample questions to ask:

qa_2 = "Which IPs accessed unauthorized endpoints?"
qa_3 = "Summarize deletion attempts on NRF instances."
qa_4 = "What fault patterns were caused by 5xx errors?"
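If the parsing regex in Step 3 does not match your log lines, the `documents` list ends up empty and retrieval silently returns nothing. A quick sanity check on a synthetic line catches that early. This is a sketch: the sample line below is an assumption about the free5GC GIN log layout, not a real capture, so substitute a line from your own file.

```python
import re

# Same pattern as in Step 3 of this walkthrough
NRF_LINE = re.compile(
    r"(\d{4}-\d{2}-\d{2}T[^\s]+)\s\[INFO\]\[NRF\]\[GIN\]\s\|\s(\d{3})"
    r"\s\|\s([\d\.]+)\s\|\s([A-Z]+)\s\|\s([^|]+)"
)

# Synthetic log line for illustration only (assumed format)
sample = ("2025-01-15T10:32:07Z [INFO][NRF][GIN] | 403 | 10.100.200.1 "
          "| DELETE | /nnrf-nfm/v1/nf-instances/abc")

m = NRF_LINE.match(sample)
if m:
    # Fields in group order: timestamp, status, ip, method, endpoint
    print(m.group(2), m.group(3), m.group(4))
else:
    print("Pattern did not match; adjust the regex for your log format.")
```

If this prints nothing useful for one of your real lines, tweak the regex before building the vector store.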
| Library                             | Purpose                                            |
| ----------------------------------- | -------------------------------------------------- |
| `langchain` + `langchain-community` | Core RAG framework + integrations                  |
| `chromadb`                          | In-memory vector store for fast document retrieval |
| `sentence-transformers`             | Embedding models for turning text into vectors     |
| `openai`                            | OpenAI client (compatible with the Groq API)       |
| `gradio`                            | Lightweight chatbot UI framework for Colab or web  |

https://colab.research.google.com/#scrollTo=YAHWg0uyX8iN

--------------------------------------------------------------------------------
/Chapter3/03_05.md:
--------------------------------------------------------------------------------
Self-Healing:

Objective: Detect and Respond to Anomalous Behavior in 5G NRF Logs Using Machine Learning

The goal is to detect rogue or misbehaving Network Functions (NFs) in a 5G network by analyzing logs generated by the Network Repository Function (NRF) and using machine learning to automate anomaly detection and simulate blocking of malicious IPs.

Step 1 – Upload the NRF Logs
------------------------------------
You uploaded your NRF log file (nrf_bad.log) into Google Colab.

from google.colab import files
uploaded = files.upload()

Think of this as giving the AI a diary of everything that happened in your network.

Step 2 – Parse and Structure Logs
----------------------------------------
You extracted useful information (like IP address, HTTP method, endpoint, status code) from each log entry and turned it into a structured table (a DataFrame).

This step transforms messy text into a clean spreadsheet for the AI to understand.
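If you don't have nrf_bad.log handy, you can generate a tiny synthetic one first and run the rest of the chapter against it. This is a sketch: the line layout below is an assumption that mirrors the regex used in the parsing step, and real free5GC NRF output may differ.

```python
# ⚠️ Assumed line format, mirroring the parsing regex in Step 2;
# real free5GC NRF output may differ.
TEMPLATE = "{ts} [INFO][NRF][GIN] | {status} | {ip} | {method} | {endpoint}"

# Hypothetical entries: one benign request and two suspicious ones
rows = [
    ("2025-01-15T10:00:01Z", 200, "10.0.0.5",      "GET",    "/nnrf-nfm/v1/nf-instances"),
    ("2025-01-15T10:00:02Z", 403, "10.100.200.1",  "DELETE", "/nnrf-nfm/v1/nf-instances/abc"),
    ("2025-01-15T10:00:03Z", 500, "198.51.100.42", "POST",   "/nnrf-disc/v1/fault"),
]

# Write the synthetic log so the upload step can be skipped locally
with open("nrf_bad.log", "w") as f:
    for ts, status, ip, method, endpoint in rows:
        f.write(TEMPLATE.format(ts=ts, status=status, ip=ip,
                                method=method, endpoint=endpoint) + "\n")

print("Wrote", len(rows), "synthetic log lines to nrf_bad.log")
```

When running locally instead of Colab, set `log_filename = "nrf_bad.log"` directly rather than using `files.upload()`.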
import re
import pandas as pd

log_filename = next(iter(uploaded))
structured_logs = []

with open(log_filename, 'r') as f:
    for line in f:
        match = re.match(r"(\d{4}-\d{2}-\d{2}T[^\s]+)\s\[INFO\]\[NRF\]\[GIN\]\s\|\s(\d{3})\s\|\s([\d\.]+)\s\|\s([A-Z]+)\s\|\s([^|]+)", line)
        if match:
            structured_logs.append({
                "timestamp": match.group(1),
                "status_code": int(match.group(2)),
                "ip": match.group(3),
                "method": match.group(4),
                "endpoint": match.group(5).strip()
            })

df = pd.DataFrame(structured_logs)
df.head()

Step 3 – Create Features
------------------------------
You added "smart columns" that highlight potentially suspicious behavior, like:

Was it a DELETE request?

Did the endpoint contain words like fault or unauthorized?

Was the response status code 500 or 403?

These features are like red flags we want our model to watch for.

# Create features
df['is_fault'] = df['endpoint'].str.contains("fault|unauthorized|error", case=False).astype(int)
df['is_delete'] = (df['method'] == 'DELETE').astype(int)
df['status_5xx'] = (df['status_code'] >= 500).astype(int)
df['status_4xx'] = ((df['status_code'] >= 400) & (df['status_code'] < 500)).astype(int)

# Label known anomalies (manually, for training)
df['label'] = (df['is_fault'] | df['is_delete'] | df['status_5xx'] | df['status_4xx']).astype(int)

# View the dataset
df[['ip', 'status_code', 'method', 'endpoint', 'is_fault', 'is_delete', 'status_5xx', 'label']]


Step 4 – Train a Machine Learning Model
--------------------------------------------

You trained a Random Forest Classifier — a popular model that learns from past data.
We gave it examples of what’s “normal” vs. “anomalous”, based on our feature flags.
(Since the label here is derived directly from those flags, the model will score near-perfectly; this is a teaching setup, not a realistic evaluation.)
The model learned patterns of bad behavior, like:

Rogue NFs trying to delete other NFs

Accessing unauthorized endpoints

Generating error responses

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

features = ['is_fault', 'is_delete', 'status_5xx', 'status_4xx']
X = df[features]
y = df['label']

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("✅ Model trained. Accuracy on test set:", model.score(X_test, y_test))


Step 5 – Predict Anomalies
---------------------------------
You used the model to scan all the logs and flag IP addresses that looked suspicious.

These IPs are now identified as potentially rogue NFs.

df['anomaly_pred'] = model.predict(X)

# Show only flagged IPs
anomalies = df[df['anomaly_pred'] == 1]
anomalous_ips = anomalies['ip'].unique()

print("🚨 Anomalous IPs detected by the model:")
for ip in anomalous_ips:
    print(f"🚫 {ip}")

Step 6 – Simulate Blocking
-----------------------------------------

You printed simulated firewall commands like:

print("\n🔒 Simulating firewall block for flagged IPs:")
for ip in anomalous_ips:
    print(f"iptables -A INPUT -s {ip} -j DROP  # Simulated block")
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
LinkedIn Learning Exercise Files License Agreement
==================================================

This License Agreement (the "Agreement") is a binding legal agreement
between you (as an individual or entity, as applicable)
and LinkedIn
Corporation (“LinkedIn”). By downloading or using the LinkedIn Learning
exercise files in this repository (“Licensed Materials”), you agree to
be bound by the terms of this Agreement. If you do not agree to these
terms, do not download or use the Licensed Materials.

1. License.
- a. Subject to the terms of this Agreement, LinkedIn hereby grants LinkedIn
  members during their LinkedIn Learning subscription a non-exclusive,
  non-transferable copyright license, for internal use only, to 1) make a
  reasonable number of copies of the Licensed Materials, and 2) make
  derivative works of the Licensed Materials for the sole purpose of
  practicing skills taught in LinkedIn Learning courses.
- b. Distribution. Unless otherwise noted in the Licensed Materials, subject
  to the terms of this Agreement, LinkedIn hereby grants LinkedIn members
  with a LinkedIn Learning subscription a non-exclusive, non-transferable
  copyright license to distribute the Licensed Materials, except the
  Licensed Materials may not be included in any product or service (or
  otherwise used) to instruct or educate others.

2. Restrictions and Intellectual Property.
- a. You may not to use, modify, copy, make derivative works of, publish,
  distribute, rent, lease, sell, sublicense, assign or otherwise transfer the
  Licensed Materials, except as expressly set forth above in Section 1.
- b. Linkedin (and its licensors) retains its intellectual property rights
  in the Licensed Materials. Except as expressly set forth in Section 1,
  LinkedIn grants no licenses.
- c.
You indemnify LinkedIn and its licensors and affiliates for i) any
alleged infringement or misappropriation of any intellectual property rights
of any third party based on modifications you make to the Licensed Materials,
ii) any claims arising from your use or distribution of all or part of the
Licensed Materials and iii) a breach of this Agreement. You will defend, hold
harmless, and indemnify LinkedIn and its affiliates (and our and their
respective employees, shareholders, and directors) from any claim or action
brought by a third party, including all damages, liabilities, costs and
expenses, including reasonable attorneys’ fees, to the extent resulting from,
alleged to have resulted from, or in connection with: (a) your breach of your
obligations herein; or (b) your use or distribution of any Licensed Materials.

3. Open source. This code may include open source software, which may be
subject to other license terms as provided in the files.

4. Warranty Disclaimer. LINKEDIN PROVIDES THE LICENSED MATERIALS ON AN “AS IS”
AND “AS AVAILABLE” BASIS. LINKEDIN MAKES NO REPRESENTATION OR WARRANTY,
WHETHER EXPRESS OR IMPLIED, ABOUT THE LICENSED MATERIALS, INCLUDING ANY
REPRESENTATION THAT THE LICENSED MATERIALS WILL BE FREE OF ERRORS, BUGS OR
INTERRUPTIONS, OR THAT THE LICENSED MATERIALS ARE ACCURATE, COMPLETE OR
OTHERWISE VALID. TO THE FULLEST EXTENT PERMITTED BY LAW, LINKEDIN AND ITS
AFFILIATES DISCLAIM ANY IMPLIED OR STATUTORY WARRANTY OR CONDITION, INCLUDING
ANY IMPLIED WARRANTY OR CONDITION OF MERCHANTABILITY OR FITNESS FOR A
PARTICULAR PURPOSE, AVAILABILITY, SECURITY, TITLE AND/OR NON-INFRINGEMENT.
YOUR USE OF THE LICENSED MATERIALS IS AT YOUR OWN DISCRETION AND RISK, AND
YOU WILL BE SOLELY RESPONSIBLE FOR ANY DAMAGE THAT RESULTS FROM USE OF THE
LICENSED MATERIALS TO YOUR COMPUTER SYSTEM OR LOSS OF DATA.
NO ADVICE OR
INFORMATION, WHETHER ORAL OR WRITTEN, OBTAINED BY YOU FROM US OR THROUGH OR
FROM THE LICENSED MATERIALS WILL CREATE ANY WARRANTY OR CONDITION NOT
EXPRESSLY STATED IN THESE TERMS.

5. Limitation of Liability. LINKEDIN SHALL NOT BE LIABLE FOR ANY INDIRECT,
INCIDENTAL, SPECIAL, PUNITIVE, CONSEQUENTIAL OR EXEMPLARY DAMAGES, INCLUDING
BUT NOT LIMITED TO, DAMAGES FOR LOSS OF PROFITS, GOODWILL, USE, DATA OR OTHER
INTANGIBLE LOSSES. IN NO EVENT WILL LINKEDIN'S AGGREGATE LIABILITY TO YOU
EXCEED $100. THIS LIMITATION OF LIABILITY SHALL:
- i. APPLY REGARDLESS OF WHETHER (A) YOU BASE YOUR CLAIM ON CONTRACT, TORT,
  STATUTE, OR ANY OTHER LEGAL THEORY, (B) WE KNEW OR SHOULD HAVE KNOWN ABOUT
  THE POSSIBILITY OF SUCH DAMAGES, OR (C) THE LIMITED REMEDIES PROVIDED IN THIS
  SECTION FAIL OF THEIR ESSENTIAL PURPOSE; AND
- ii. NOT APPLY TO ANY DAMAGE THAT LINKEDIN MAY CAUSE YOU INTENTIONALLY OR
  KNOWINGLY IN VIOLATION OF THESE TERMS OR APPLICABLE LAW, OR AS OTHERWISE
  MANDATED BY APPLICABLE LAW THAT CANNOT BE DISCLAIMED IN THESE TERMS.

6. Termination. This Agreement automatically terminates upon your breach of
this Agreement or termination of your LinkedIn Learning subscription. On
termination, all licenses granted under this Agreement will terminate
immediately and you will delete the Licensed Materials. Sections 2-7 of this
Agreement survive any termination of this Agreement. LinkedIn may discontinue
the availability of some or all of the Licensed Materials at any time for any
reason.

7. Miscellaneous. This Agreement will be governed by and construed in
accordance with the laws of the State of California without regard to conflict
of laws principles.
The exclusive forum for any disputes arising out of or
relating to this Agreement shall be an appropriate federal or state court
sitting in the County of Santa Clara, State of California. If LinkedIn does
not act to enforce a breach of this Agreement, that does not mean that
LinkedIn has waived its right to enforce this Agreement. The Agreement does
not create a partnership, agency relationship, or joint venture between the
parties. Neither party has the power or authority to bind the other or to
create any obligation or responsibility on behalf of the other. You may not,
without LinkedIn’s prior written consent, assign or delegate any rights or
obligations under these terms, including in connection with a change of
control. Any purported assignment and delegation shall be ineffective. The
Agreement shall bind and inure to the benefit of the parties, their respective
successors and permitted assigns. If any provision of the Agreement is
unenforceable, that provision will be modified to render it enforceable to the
extent possible to give effect to the parties’ intentions and the remaining
provisions will not be affected. This Agreement is the only agreement between
you and LinkedIn regarding the Licensed Materials, and supersedes all prior
agreements relating to the Licensed Materials.

Last Updated: March 2019
--------------------------------------------------------------------------------