Animation Using 3D Model Through Kinect Computer Science Essay


This project is about animation and gaming experiences for the present and the next generation, and it also presents ideas for future advances in 3D viewing of objects and 3D mapping. In past years cameras were used for video recording, video chatting and taking pictures, but in this project you will experience a different way of using cameras and their capabilities. In animation films the character is modelled and then animated frame by frame, or with motion-tracking suits, both of which consume a great deal of time and money. We therefore need an easier technique that reduces cost and time, and that technique is 3D motion animation through the Microsoft Xbox 360 Kinect. This project allows you to create a home animation video or a teaching tutorial video, and in the near future this is something anyone will be able to do.

The requirements and methods needed for the project are discussed in the following parts. All the techniques are explained in detail and in steps.

This chapter gives an overview of all the requirements and methods used in the project. The project is implemented to reduce the time and cost involved in animation, gaming and film-making. This coursework shows how a 3D character created in any 3D package can be animated with the Microsoft Kinect motion-detecting system. The animation movements you perform can be recorded as images or in a movie file format, and these files can then be used, for example, to demonstrate an event, teach dance steps, or serve whatever purpose the animation is intended for.

1.2 Aim of the project:

The aim of the project is to animate a 3D model using a motion-capturing device, the Microsoft Kinect, with the help of 3D modelling software such as MotionBuilder, 3ds Max, Maya, LightWave 3D, SketchUp Pro and others.

1.3 Requirements:

This chapter deals with the things required for designing and animating a 3D character using the Kinect. The main requirements of this project are listed below:

Microsoft Xbox 360 Kinect.

Motion detecting software.

Software that helps to convert the detected motion into 3D object.

Character modelling software: MotionBuilder or 3ds Max, which are products of Autodesk.

An adapter that connects the Kinect and the PC so they can interact.

Drivers for the Kinect cameras.

1.4 Activities to undertake:

To complete the project you should have a good knowledge of 3D modelling software such as 3ds Max, Maya or MotionBuilder. The motion-detecting software must be installed on your PC; it can be downloaded from the link given in the references. MotionBuilder helps you apply the motion detected by the Kinect to 3D objects imported from 3D modelling software. The Microsoft Kinect hardware costs around £70-£100 and the power adapter cable costs around £6 on ebay.co.uk. Finally, install the drivers for the Kinect hardware.

1.5 Outline of the rest of the thesis:

Chapter 2: Background

Chapter 3: Requirements

Chapter 4: Design

Chapter 5: Implementation

Chapter 6: Evaluation of the Software.

2. BACKGROUND:

2.1 Microsoft Kinect:

Microsoft was founded in 1975 by Bill Gates and Paul Allen and has had success with many products, such as Windows 1.0, 2.0, 3.0, 95, 98, 2000, XP, Vista and 7, as well as Bing, the Xbox 360 and the Kinect. The Kinect is a Microsoft product announced in 2009 and designed for motion-sensing gaming. The name Kinect combines the key aspects of the words 'kinetic' and 'connect'.

The Kinect is basically a gaming device for the Xbox 360, and it became the fastest-selling consumer electronics device in history: around 10 million units were sold following its launch on 4 November 2010, an average of 133,333 units per day, according to Guinness World Records. These sales figures outstrip both the iPad and the iPhone over an equivalent period after launch. The Kinect is not only used for gaming; it is also being widely adopted in media, movies, sports, exercise classes, medicine, entertainment, teaching and more.


Figure: Microsoft Kinect

The Kinect has an optical setup that allows it to track the user's movements, gestures and actions. It is made up of two main parts: an IR projector and an IR VGA camera. The IR sensor is a depth sensor combined with a monochrome CMOS sensor, which makes it possible to capture 3D video data under any lighting conditions, and adjustments can be made with the depth sensor. The Kinect software enables advanced gesture, face and voice recognition. The Kinect can take multiple inputs and is capable of simultaneously tracking up to six people. It creates a field of view with its camera and sensors and captures the movements, motion and gestures within that field.
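To make the depth sensor's role concrete, the short Python sketch below back-projects a depth image into a 3D point cloud using a simple pinhole camera model. It is only an illustration: the focal length and principal point values are assumed, commonly quoted approximations for the Kinect depth camera rather than calibrated values, and this is not how the Kinect SDK or Brekel implement it internally.

import numpy as np

# Assumed approximate intrinsics for the Kinect depth camera (values commonly
# quoted by the OpenKinect community; calibrate your own device for real use).
FX, FY = 594.2, 591.0   # focal lengths in pixels
CX, CY = 339.5, 242.7   # principal point in pixels

def depth_to_points(depth_m):
    """Back-project a (480, 640) array of depths in metres to an Nx3 point cloud."""
    v, u = np.indices(depth_m.shape)       # pixel row/column grids
    z = depth_m
    x = (u - CX) * z / FX                  # pinhole back-projection
    y = (v - CY) * z / FY
    points = np.dstack((x, y, z)).reshape(-1, 3)
    return points[points[:, 2] > 0]        # drop pixels with no depth reading

# Example: a synthetic flat wall two metres from the sensor
cloud = depth_to_points(np.full((480, 640), 2.0))
print(cloud.shape)                         # (307200, 3)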


Figure: Kinect details

Features of the Kinect:

Kinect sensor video output: frame rate of 30 Hz.

RGB sensor: 8-bit VGA resolution with a Bayer colour filter.

Monochrome depth sensor: 11-bit VGA resolution with 2,048 levels of sensitivity.

Kinect sensor ranging limit: 1.2 m - 3.5 m when used with the Xbox software.

Extended tracking range: approximately 0.7 m - 6 m.

Microphone array: four microphone capsules, each channel processed as 16-bit audio at a sampling rate of 16 kHz.

Angular field of view: 57° horizontally and 43° vertically, and the sensor can be tilted up to 27° either up or down. (13)
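To put these figures in context, the short Python sketch below (an illustrative calculation, not part of any Kinect SDK) works out how wide an area the sensor sees at various distances from the field-of-view angles above, and the raw data rate of the four-channel microphone array.

import math

H_FOV, V_FOV = 57.0, 43.0          # field of view in degrees (horizontal, vertical)
XBOX_RANGE = (1.2, 3.5)            # ranging limit with the Xbox software, in metres

def coverage(distance_m, fov_deg):
    """Width of the area visible at a given distance for a given field of view."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg / 2.0))

for d in (XBOX_RANGE[0], 2.0, XBOX_RANGE[1]):
    print(f"at {d:.1f} m: {coverage(d, H_FOV):.2f} m wide x {coverage(d, V_FOV):.2f} m high")

# Raw throughput of the four-capsule microphone array:
# 4 channels x 2 bytes (16-bit) x 16,000 samples per second = 128,000 bytes/s.
print(4 * 2 * 16_000)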

2.2 Autodesk 3D max:

3D Studio Max is professional 3D computer graphics software for modelling and rendering, used for animation, visual effects, game character development and 3D effects in film. It was published by Autodesk for the DOS platform, later re-released for Windows NT, and then renamed 3ds Max.


Figure: 3ds Max logo

The important features of 3D Studio Max include MAXScript, Character Studio, DWG import, general keyframing, skinning, cameras, lighting, rendering, skeletons and inverse kinematics (IK).

The four phases of work in 3D Studio Max are: (1)

Modelling: modelling a character is based on reference images and imagination. 3ds Max offers several viewports, such as top, left, front and perspective, and using these views the character model is easy to lay out and build. The most widely used approach is the low-poly modelling technique: too many polygons in a video game can strain production, so when the polygon count is reduced the model is easier to work with and rotation is smoother. That is why 3ds Max is known for its ease of modelling.


Figure: modelling a human in 3ds Max

Material mapping: this is about understanding the way people see objects and the importance of the material applied to the model, which matters for a good material appearance on screen. It shows the result of lighting effects on the material; 3ds Max provides various materials that can be edited and used according to the user's choice.


Figure: material view in 3ds Max

Animation: animation brings the modelled character to life, with behaviour that can be user-defined, saved and modified using key frames. For example, when a ball is dropped from a height, the way the ball falls to the ground under gravity, the time it takes to reach the point of impact and its bounce are all taken into consideration; the animation is then processed and recorded as key frames. Animation can be a naturalistic or a stylised way of bringing a model to life.
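As a rough illustration of the key-frame idea described above, the following Python sketch computes the falling ball's height per frame at 30 fps, stores only a few key frames and linearly interpolates the frames in between. The numbers and frame choices are assumptions for illustration only; 3ds Max uses its own, more sophisticated interpolation controllers.

# A minimal sketch of the falling-ball example: the animation stores a few key
# frames and every in-between frame is interpolated. All values are illustrative.
G = 9.81            # gravitational acceleration in m/s^2
FPS = 30            # playback rate
DROP_HEIGHT = 4.9   # metres; the ball reaches the ground after about one second

def height_at(t):
    """Physically correct height of the ball t seconds after release."""
    return max(DROP_HEIGHT - 0.5 * G * t * t, 0.0)

# Store a key frame every half second (frames 0, 15 and 30)...
keyframes = {f: height_at(f / FPS) for f in range(0, 31, 15)}

def interpolated(frame):
    """...and linearly interpolate every in-between frame from those key frames."""
    keys = sorted(keyframes)
    lo = max(k for k in keys if k <= frame)
    hi = min(k for k in keys if k >= frame)
    if lo == hi:
        return keyframes[lo]
    w = (frame - lo) / (hi - lo)
    return (1 - w) * keyframes[lo] + w * keyframes[hi]

for f in (0, 7, 15, 22, 30):
    print(f"frame {f:2d}: true {height_at(f / FPS):.2f} m, keyframed {interpolated(f):.2f} m")

The gap between the "true" and "keyframed" values shows why fast motion, such as the moment of impact, needs extra key frames placed around it.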

Rendering: rendering brings together aspects such as lighting and camera views to produce the final images the audience will see. The movements, colour references, style and material appearance can all be rendered easily in 3ds Max.

3ds Max is used in many fields of industry, such as entertainment, gaming, mobile applications, teaching, film and education.

2.3 Autodesk MotionBuilder 2012:

MotionBuilder is an Autodesk product; the 2012 release is the version used in this project.

It is real-time 3D character animation software for virtual production, built for and embraced by the entertainment industry, and it is one of the most widely used character animation packages among entertainment and animation companies. You can create, edit and play back highly complex characters in an interactive, responsive environment. It also helps you import (acquire), export and refine motion data with great precision and reliability.


Features of MotionBuilder:

One of the fastest animation tools in the industry, thanks to its real-time 3D engine.

Live device data and live objects can be recorded straight to disk.

Motion capture data can be manipulated for characterising and editing models.

Smooth interoperability.

2.4 Kinect Adapter:

The Kinect sensor power supply provides power and connectivity for the Kinect so that it can be used anywhere, including with an original Xbox 360 console. In this project we use the adapter to connect and communicate between a Windows PC and the Kinect hardware without an Xbox console. The adapter's input is 100-240 V at 50/60 Hz and its output is 12 V DC at 1.08 A (roughly 13 W).


The adapter has two connectors: a female port for the Kinect and a male USB plug for the PC.

2.5 Kinect drivers for pc:

The drivers for the Kinect come from an open-source project on GitHub (https://github.com/avin2/SensorKinect). They cover the three major parts of the Kinect: audio, camera and motor/depth. These drivers let a Windows PC identify the Kinect hardware (camera, audio and depth camera) and use it to track the room and the user. In effect they 'hack' the Kinect hardware and enable it to work on the Windows platform.

2.6 Brekel Kinect Software:

Brekel Kinect is software that provides the interface between the Kinect sensors and the display. It shows three panes: (1) the normal camera view, (2) the depth camera view and (3) the user-tracking view.


The software lets you adjust, edit and record the movements in image format.

For a successful installation we need to install three different pieces of software from the website (http://www.brekel.com/?page_id=160), one after the other:

Brekel OpenNI Kinect auto installer- Developer editionv1.5.4.0.exe

Brekel Kinect setup v0.50.exe

Brekel Kinect MoBu Device v0.15.exe

Further details and the installation steps are explained in the coming chapters.


3. REQUIREMENTS:

3.1 Explicit lists of requirements:

Microsoft Xbox 360 Kinect.

An adapter that connects the Kinect and the PC so they can interact.

Motion detecting software.

Character modelling software: MotionBuilder and 3ds Max, which are products of Autodesk.

The first requirement is the Microsoft Kinect device, which is the main part of this project. A Kinect costs around £70-£100 in gaming stores and online shops. The Kinect has changed the way we play games and watch television: there is no need for a joystick or a remote to play a game or change a channel. The device requires no controllers (1); our body is the controller, and the system responds to movements made according to its instructions, for example changing channel, searching, controlling the volume and more.

The main features of the Kinect are its three sensors:

Kinect multi array mic sensor

Monochrome depth sensors

RGB sensor

Multi-array microphone: this sensor picks up the commands given by the user with good accuracy. The microphone capsules are arranged in a series along both sides of the Kinect unit.

Depth sensor: this sensor evaluates the surroundings of the user and identifies the living and non-living objects in the room of activity. It has 11-bit VGA resolution with 2,048 levels of sensitivity.

RGB sensor: this sensor is essentially a camera with 8-bit VGA resolution and a Bayer colour filter. It provides video confirmation of the user and displays the scene in RGB colour for the human interface.

Kinect power adapter: this adapter powers the Kinect unit from an external source and has two connectors, one for the Kinect and the other for the PC. It also carries the data transfer and communication between the Microsoft Kinect and the operating system.

Motion detection software:

This software is used to detect the gestures and movements of the user in front of the Microsoft Kinect. The software is Brekel Kinect, which is an open-source, free 30-day trial kit. It 'hacks' the Kinect and lets us interact with it from a Windows PC. It provides three views: one for the normal camera, a second showing the camera's depth map, and a third tracking the motion of the user in the form of dots and lines.

The software notifies you when the user is being tracked and when the user is lost. Using it, we record the movements of the user and apply them to a model in MotionBuilder, 3ds Max or any other character modelling software.

The Brekel Kinect software also provides the additional features given below.

Capture 3D objects

Capture dynamic point clouds

Capture motion to a BVH file

Stream a skeleton into MotionBuilder

Get a pointcloud into nuke

3d modelling software:

The 3D modelling software used in this project is MotionBuilder and 3ds Max. For this project you need to know one 3D modelling package in some detail, for example Maya, 3ds Max, MotionBuilder, SketchUp Pro, LightWave or many others; this will help you to create a character, animate it and render the output. When I started the project, learning 3D modelling consumed a lot of time, but learning the basics of modelling improved my skills and helped me later.

As discussed above, 3ds Max and MotionBuilder have similarities and differences, which are shown below.

A tree graph representing the features of 3ds Max:

A tree graph showing the features of MotionBuilder:

4. DESIGN:

How to - motor/LED driver installation

- Go to Windows Device Manager and locate your "Kinect Motor" driver usually under the "PrimeSensor" folder.

- Uninstall the driver

- Unplug the Kinect device

- You should now see an unrecognised Kinect Motor device; right-click it and choose "Update driver"

- Browse to the location where you installed "Brekel Kinect" there should be a folder named "motor driver" with subfolders for x86 (32bit OS) or x64 (64 bit OS)

How to - installation instructions

Quick and automatic method:

1. Completely remove any currently installed Kinect drivers (OpenNI, OpenKinect libfreenect, Code Laboratories CL-NUI)

2. Install drivers using the All-in-one OpenNI Kinect Auto Installer.

3. Install the latest Brekel Kinect base application.

4. Optionally install the MotionBuilder plugins.

Manual method:

1. Completely remove any currently installed Kinect drivers (OpenNI, OpenKinect libfreenect, Code Laboratories CL-NUI)

2. Download and install OpenNI

3. Download and install the Kinect drivers

4. Download and install NITE

Use this key during installation: 0KOIk2JeIBYClPWVnMoRKn5cdY4=

Your Windows Device Manager should look something like this:

[Screenshot: Windows Device Manager showing the installed Kinect devices]

5. Try one of the OpenNI samples:

C:\Program Files (x86)\OpenNI\Samples\Bin\Release\NiSimpleViewer.exe

It should come up with a yellowish full screen stream of the depth sensor.

(Note: examples are only included in the Development Editions of OpenNI)

Go back to step 1 if this doesn't run.

6. Install the latest Brekel Kinect base application.

(Your firewall may ask for permission to open a port)

7. Optionally install the MotionBuilder plugins.

  How to - Capture 3D objects

1. On the top left under "Point Cloud"

- set 'Export Format' to the file type you want. Or none if you don't want to export the point cloud.

- set 'Draw/Export Mode' to either export points or a quad mesh. Quad meshes are exported with UV coordinates for easy texture mapping; a small sketch of this layout follows this list. (Note that not all 3D packages will be able to handle just points)

2. On the bottom left under "Color"

- Set 'Export Format' to the file type you want. Or none if you don't want to export colour images.

3. On the bottom right under "Depth"

- Set 'Export' Format to the file type you want. Or none if you don't want to export depth images (For regular use you may not need these).

4. In the middle under "Capture"

- Hit 'Browse' if you want to specify your own output folder. By default exported files are saved to a folder named "Brekel_Kinect_Captures" in your "My Documents" folder

- Hit the 'Capture Frame' button to export a single frame. Subsequent captures will automatically be numbered.

- Hit the 'Start Continuous Capture' button to keep capturing as fast as possible. Hit it again to stop capturing (Note that capture speed is dependent on what you're exporting and which file formats you're using).
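To give a feel for what the quad-mesh export mentioned in step 1 contains, here is a minimal Python sketch that writes a depth frame as an OBJ quad mesh with UV coordinates. This is not Brekel's actual exporter, just an assumed, simplified illustration of the same idea.

import numpy as np

def depth_to_obj_quads(depth, path):
    """Write a depth frame (a 2D array of metres) as an OBJ quad mesh with UVs.
    Illustrative only; Brekel's own exporters handle this internally."""
    h, w = depth.shape
    with open(path, "w") as f:
        for v in range(h):
            for u in range(w):
                # One vertex per depth pixel (x = column, y = row, z = depth)
                f.write(f"v {u} {v} {depth[v, u]:.4f}\n")
                # Matching UV coordinate so a colour frame can be mapped on later
                f.write(f"vt {u / (w - 1):.4f} {1 - v / (h - 1):.4f}\n")
        def idx(row, col):
            return row * w + col + 1           # OBJ indices are 1-based
        for v in range(h - 1):
            for u in range(w - 1):
                a, b, c, d = idx(v, u), idx(v, u + 1), idx(v + 1, u + 1), idx(v + 1, u)
                f.write(f"f {a}/{a} {b}/{b} {c}/{c} {d}/{d}\n")

# Tiny 3x3 synthetic frame, just to show the file layout
depth_to_obj_quads(np.full((3, 3), 2.0), "frame_0001.obj")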

    

How to - Capture dynamic point clouds

1. On the top left under "PointCloud"

- set 'Export Format' to "Binary". This is a custom format designed for speed, saving raw data.

2. On the bottom left under "Color"

- Set 'Export Format' to none if you don't need a texture. Which format to choose depends on your disk and cpu speed so you may have to experiment a little. I've had fast results with the BMP and TGA formats in uncompressed mode.

3. On the bottom right under "Depth"

- Set 'Export Format' to none as we don't need this and it will only slow things down.

4. Under "Capture"

- Hit 'Browse' if you want to specify your own output folder. By default exported files are saved to a folder named "Brekel_Kinect_Captures" in your "My Documents" folder. Again make sure you select a fast drive; SSDs are excellent.

- Hit the 'Start Continuous Capture' button to start capturing.

- Hit "Stop Continuous Capture" to stop recording.

- You should now have a bunch of .bin files in your capture folder.

- Close "Brekel Kinect"

5. Offline Scan Processor (from Windows Start Menu)

This is a separate program which can convert .bin files saved from Brekel Kinect into 3D objects.

The same formats are supported as in the Brekel Kinect base application, but doing this in a separate application allows you to record in (semi-)realtime and process the data afterwards.

- Under "Output" select the geometry output format you want

- To convert a single file

    - hit 'Browse' next to "Input File" and browse to the .bin file you want to convert and hit 'Process' on the same line

    - the output object will be saved into the same folder

    - keep an eye on the console window for how long the conversion took

- To convert a whole folder of files

    - hit 'Browse' next to "Input Folder" and browse to a folder containing .bin files and hit 'Process' on the same line

    - all files will be converted one by one and saved in the same folder

    - keep an eye on the console window for which file we're at and how long the conversion took

How to - Capture motion to a BVH file

1. On the bottom right under "Depth"

- Make sure 'Nite User Tracking' is enabled.

- Stand in front of the kinect and make sure your entire body is visible.

- You should be recognised automatically which will be indicated by coloured pixels in the depthmap and an audio signal.

    When recognition fails:

      - make sure your face is visible

      - move around a bit

      - don't stand too close to other objects

- Tracking will now start to look for the Psi-Pose so please assume this pose.

- When the pose is recognized the tracker will start calibrating, this will typically take a few seconds to complete.

    When calibration fails:

      - make sure you stand still during calibration

      - go back to neutral pose and try again

      - make sure you're totally visible

      - "Visible Light" vs "Infra Red" can make a difference

      - lighting conditions in your room can make a difference

      - loose clothing or loose long hair may make it fail

      - too many other people in the view may make it fail

    

- Tracking should now start and you'll see an overlay of joints in the depth and pointcloud views.

    Note that points with low confidence (usually due to occlusion) will be coloured in red.

2. Under Capture

- Choose which naming convention you want: 'HumanIK' for working with MotionBuilder or most other packages, or 'Biped' for working with Biped in 3ds Max

- Hit "Start Capture BVH" and perform the motions you want

- Hit "Stop Capture BVH" when done

- Hit "Open Capture Folder" to easily find the captured file

- Note that when you record again the capture file will be appended with a number so the previous one is not overwritten

3. Biped specific notes

When you're importing a BVH into 3DMax Biped, make sure that in the "Motion Capture Conversion Parameters" dialog

you set the "Limb Orientation" (bottom right of the window) for the Knees and Elbows to "angle", instead of the default "point"

The usual workflow for Bipeds is:

- create a fresh Biped

- import the .BVH file

- export as .BIP file

- load the .BIP file onto your Biped with your character attached

4. Blender specific notes

Make sure in the import settings you change the rotation setting to Quaternion instead of Euler for best results.
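The .BVH files produced by the capture step above are plain text, so they are easy to inspect before importing. The Python sketch below reads the joint names, frame count and frame time from a captured file; the file name is hypothetical, since Brekel numbers its captures automatically.

def bvh_summary(path):
    """Read joint names, frame count and frame time from a BVH file.
    A rough sketch of the layout; a real parser also reads OFFSET/CHANNELS data."""
    joints, frames, frame_time = [], 0, 0.0
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if not tokens:
                continue
            if tokens[0] in ("ROOT", "JOINT"):       # entries of the skeleton hierarchy
                joints.append(tokens[1])
            elif tokens[0] == "Frames:":             # number of captured frames
                frames = int(tokens[1])
            elif line.strip().startswith("Frame Time:"):
                frame_time = float(tokens[2])        # seconds per frame
    return joints, frames, frame_time

# Hypothetical file name; Brekel numbers its captures automatically.
joints, frames, frame_time = bvh_summary("capture_0001.bvh")
print(f"{len(joints)} joints, {frames} frames, "
      f"{frames * frame_time:.1f} s at {1.0 / frame_time:.0f} fps")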

How to - Stream a skeleton into MotionBuilder

1. On the bottom right under "Depth"

- Make sure 'Nite User Tracking' is enabled, then follow the same recognition, Psi-Pose and calibration steps described in "How to - Capture motion to a BVH file" above, until tracking starts and you see the overlay of joints in the depth and pointcloud views.

2. In MotionBuilder

- In MotionBuilder drag a new 'Brekel Kinect Device' into the scene.

- Point the 'Hostname/IP' to the IP of the machine that is running the 'Brekel Kinect 3D Scanner' application. If you're running everything on a single machine you can also use '127.0.0.1'

- Set the device's Online and Live states to ON.

- Under "Model binding" create a new skeleton (or point to a pre-existing one).

- If you want to record:

    - Set the device's Recording state to ON

    - Hit the Record button on the timeline

    - Hit the play button on the timeline

- If you want to playback a recording:

    - Set the device's Live state to OFF

    - Hit play on the timeline   

- If you want to create a character node (for character retargeting):

    - Ask your actor/talent to assume a T-stance pose

    - Hit the "(Re)Create Character Nodes" button

    - A characterNode called "Kinect" will be created in your scene (an existing one will be re-created)   

- If you want to connect another character to the Kinect data:

    - On the characterNode that you want to be driven

        - Set "Input Type" to "Character Input"

        - Set "Input Source" to "Kinect"

        - Set "Active" to true

How to - Get a pointcloud into Nuke

1. On the top left under "PointCloud"

- set 'Export Format' to None as we'll be using images only

2. On the bottom left under "Color"

- Set 'Export Format' to a filetype readable by Nuke.

    "TIFF" should work.

3. On the bottom right under "Depth"

- Set 'Export' Format to "EXR RGBXYZ".

4. In the middle under "Capture"

- Hit the 'Capture Frame' button to export a single frame. By default exported files are saved to a folder named "Brekel_Kinect_Captures" in your "My Documents" folder

5. In The Foundry's Nuke

- Add the captured frames to a new script

- Add a "PositionToPoints" node to the script.

    - By default this node is a hidden node so do the following:

      - From the toolbar click the "Other" icon.

      - Click "All Plugins".

      - Click "Update".

      - Again click "Other" and "All Plugins" and notice how hidden nodes now show up.

      - You should now be able to add a "PositionToPoints" node to the script and continue.   

- Connect the "col" input of your "PositionToPoints" node to your color image.

- Connect the "xyz" input of your "PositionToPoints" node to your depth image.

- View your "PositionToPoints" node in the viewer.

    It should automatically switch from 2D to 3D mode.

    Zoom way out (pointclouds are big, a few hundred units in width/height, a few thousand in depth).

    And remember that by default the camera is on the ground so also move it up.
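The same Nuke graph can also be wired up from Nuke's Script Editor with its Python API. The sketch below is a rough, assumed version of the steps above: the file names and the input indices of the PositionToPoints node (0 for "col", 1 for "xyz") are assumptions, so check the node's input labels in your Nuke version.

# Rough sketch, intended to be run from Nuke's Script Editor.
import nuke

colour = nuke.nodes.Read(file="Brekel_Kinect_Captures/color_0001.tiff")  # assumed file name
depth = nuke.nodes.Read(file="Brekel_Kinect_Captures/depth_0001.exr")    # assumed file name

p2p = nuke.createNode("PositionToPoints")   # may need "All Plugins" -> "Update" first, as above
p2p.setInput(0, colour)                      # "col" input (assumed index)
p2p.setInput(1, depth)                       # "xyz" input (assumed index)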

    

5. IMPLEMENTATION:

6. EVALUATION OF SOFTWARE:

7. CONCLUSION:

In today's fast-growing world, many film animators and game developers are looking for quick, high-quality solutions. This project is designed and developed to make things easier for animators and gamers, but it is not a replacement for earlier methods and techniques. Even though this project helps you learn new skills and practise various techniques, you still have to work hard to create a character model and animate it with accurate motion detection.

This project explains why and how certain tools and software are used; you learn not only how to handle the tasks but also how to tackle future modelling and animation problems. In this project you learn how to connect a Microsoft Kinect to a Windows PC, detect the motion of the user and connect the user's motions live to a character model through the Brekel Kinect software. The motion-captured performance can be composited into live action and special effects can be added. Motion-animated characters can be used for presenting architectural 3D models, for teaching lessons in education, for games and much more; as a future advancement, motion detection could also help blind people detect objects and receive hints about their surroundings. Many more things can be done with the Kinect motion detector in animation, for example through script operators or programming languages; the capabilities of combined animation and motion detection are still largely uncharted, and many good new features should come from them.

During the project there were many difficulties, but when you stay focused the problems can be overcome. Time management is an important task; for various reasons some of the planned timings were delayed, but in the end all tasks achieved their goals on time with a good success rate. I have learned many things during the project, such as time management, background research, correcting mistakes, retrying and self-encouragement.

Even though there were some difficulties during the project, they made me stronger and I resolved them later. Finally, the project has been very useful to me in terms of the new skills and knowledge gained from the process.

8. PROJECT EVALUATION:

8.1 Evaluation:

When the project started there were many setbacks regarding the 3D model and the data transfer between the PC and the Kinect motion sensor, but after putting in double the work on these aspects I was able to carry them out successfully.

The aim of the project is to connect the Kinect motion detector to a 3D model so that models can be animated better, cheaper and more easily than with other methods. The aim is useful and realistic, and I believe this project will give animators quick relief in their work. It will also help in game design and animation; teaching techniques can be enhanced, and a student's mistakes can be corrected by the detector in training programmes such as learning boxing, dancing, or even exercise or yoga. In films such as Avatar, Toy Story and Hulk, animation is a major part of the project: motion-capture marker suits are used to track the actors' movements and gestures, and this data is imported onto the character models. This takes a lot of time to track and implement, and the marker suits can even cause some health problems, so I believe Kinect-based animation is a better solution than the methods above.

8.2 Suggestions for future work:

This project could be improved at many levels, for example:

In this project the Kinect sensor can detect and map one person at a time; improving the Kinect to detect multiple users would increase its versatility.

There is some jitter in the tracked movements, which could be rectified by better and more advanced motion-detecting cameras.

The Microsoft Kinect has a detection range of approximately 0.7 m - 6 m; if the range were increased and the depth detection enhanced, the user's motion could be captured with more freedom and flexibility.

With more advanced motion-detection technology, architects could present, create, edit and even modify their work in 3D models and 3D virtual perspectives.

The Kinect's motion and object detection could also be used for obstacle-free path sensing, detecting approaching objects, providing a helping hand for blind people, and for automated vehicles.
