What are NFC Tags?

NFC (Near Field Communication) Tags are tiny stickers, cards, and keychains that can store and send information to a phone when tapped. They don’t need to be charged because they draw their power from the phone’s NFC field when the two are brought close together.

NFC tags are used for many things:

  • Quick Actions: Tap your phone on a Tag to turn on WiFi, open an app, or play music
  • Contactless Payments: Used in credit cards and Apple Pay/Google Pay to make purchases
  • Smart Business Cards: Tap a Tag to instantly share your contact information
  • Access Control: Used in keycards for doors and event passes to enter restricted areas
  • Home Automation: Used to turn on/off lights, adjust thermostat, control fans, etc.

How to add data to an NFC Tag:

  • Step 1: Purchase an NFC sticker, card, or keychain online
  • Step 2: Download an NFC app (e.g. NFC Tools)
  • Step 3: Choose what you want the Tag to do (share contact details, turn on WiFi, send message, etc.)
  • Step 4: Write the data to the Tag – Hold your phone close to the Tag, press “Write” and the phone will save the info onto the Tag
  • Step 5: Test it – Tap your phone on the Tag to see if it works
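
For the curious, what Step 4 actually saves is an NDEF (NFC Data Exchange Format) message. The sketch below hand-encodes the simplest common case – a single short URI record – following the public NDEF layout. A real app such as NFC Tools supports many more record types, so treat this as an illustration only:

```python
# Encode a single short NDEF URI record, the kind of data an NFC app
# writes to a tag when you choose "open a website".
# Well-known URI prefixes are abbreviated to one byte to save tag space.
URI_PREFIXES = {
    "http://www.": 0x01,
    "https://www.": 0x02,
    "http://": 0x03,
    "https://": 0x04,
}

def encode_uri_record(uri: str) -> bytes:
    """Build a one-record, short-format NDEF message holding a URI."""
    code = 0x00  # 0x00 = no abbreviation, URI stored in full
    for prefix, prefix_code in URI_PREFIXES.items():
        if uri.startswith(prefix):
            code = prefix_code
            uri = uri[len(prefix):]
            break
    payload = bytes([code]) + uri.encode("utf-8")
    header = 0xD1       # MB=1, ME=1, SR=1 (short record), TNF=1 (well-known)
    type_length = 1     # the record type is the single byte 'U' (URI)
    return bytes([header, type_length, len(payload)]) + b"U" + payload

print(encode_uri_record("https://www.example.com").hex())
```

Tapping a tag containing those few bytes is enough to make a phone open the website – everything else is handled by the phone’s NFC stack.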

Posted in Uncategorized

Speech Recognition Engine for Controlling RixBot

Finally, I managed to find some spare time today to connect up the audio system to a Raspberry Pi 4, which includes a simple speaker output and MAX9814 condenser mic input module. The purpose of this setup is to allow the voice recognition software (VOSK) to be installed on the Raspberry Pi, and respond to simple prompts. It took a huge amount of tweaking (and a lot of head scratching, cups of tea and chocolate) to finally resolve all the conflicts, but it seems to work, and the great thing about it is that it works offline.

The screenshot below shows a simple test that illustrates the software responding to voice prompts after I spoke several simple commands into the microphone:

I was impressed that the mic module would pick up on commands from 2 metres away, and did not make any mistakes.

The next stage of this work will be to attach the motor controller to the Raspberry Pi and install the whole module into the RixBot body. The body of the RixBot is based upon a RealRobots ‘Cybot’ body, which was originally sold as a kit through a weekly publication (see below). This is an excellent basis for a robot, as it already has geared motors, battery compartments and places for light and ultrasonic sensors built into it.

Once the Raspberry Pi has been mounted inside the robot body, and the motor controller installed, then it is ready for testing at Rix with our co-researchers.

Posted in Uncategorized

Is Artificial Intelligence (AI) Inclusive? Workshop at Rix (12/02/2025)

Kate introducing the Generative AI workshop.

This is the second Generative AI workshop that we have hosted aimed at finding out more about the accessibility and usability of Generative AI software. The workshop developed ideas that were explored during our first workshop, presented as part of  ‘Creative inclusive research explorations of intelligent technologies/Artificial Intelligence (AI) at the University of Ravensbourne MeCCSA Symposium’

At the Symposium, delegates worked in diverse, inclusive teams and were given the chance to try out different AI software engines such as Stable Diffusion, in this instance hosted by NightCafé. So that participants could simply experience the AI software, we set no specific goal and asked people to experiment and enjoy using their imagination to see what could be created. For this second workshop, our co-researchers were given the task of generating images for a specific theme – health and wellbeing.

Researcher and co-researchers entering textual prompts to an AI engine.

Our co-researchers were all very keen and excited to learn and work with AI and see what they could create for themselves. We worked in teams of people with and without a learning difference and disability (LDD). Along with gaining a better understanding of the inclusivity of AI software, a related focus was a future where AI has a prominent role, and the belief that our diverse team should be included in the development of this AI. Our workshop discussions started to showcase the potential advantages of these tools for people with LDD, which is particularly relevant in light of the conversations regarding the ethics of AI and access to the services that it provides.

Co-researcher entering a text prompt.

The participants at the workshop session were given the theme of Health and Wellbeing as we age, with the specific focus on healthy food, exercise and maintaining good mental health. They were asked to try and capture these three threads by generating images (including cartoon sequences) using the AI software and by providing the prompts, either through speech, text or by uploading a sketch they had drawn.

One of the images representing health and wellbeing, generated by the NightCafé AI engine

Being able to talk to the AI seemed to work well for some who found it difficult to type, although some found that the dictation software (built into the MacBook) misunderstood what they wanted to say, possibly because the speech was not clear enough. Some of this was possibly due to the unique formation of the sentences they spoke, which could perhaps be improved upon with an AI transcriber that learns the idiosyncrasies of the speaker and transcribes their intentions more accurately. We haven’t yet found such software, so there is potential for a project that addresses this issue.

We discussed how AI could aid inclusion in many ways, including translating ideas, editing first thoughts, and creating imagery to help share ideas and emotions. As an augmentation tool, generative AI engines are relevant in many areas, particularly in ideation, despite the AI misunderstanding prompts, creating odd versions of reality, or simply lacking originality. No doubt, as the software (and hardware) improves, so too will its ability to pre-empt what we really wanted and offer its own solutions.

The almost conversational approach to the AI was highlighted by one co-researcher who, on seeing the images the software produced of healthy food, commented that the portions were huge – as illustrated below.

Gigantic portions generated by AI engine.
Image produced using the prompt "Create images for cartoon strip of somebody having a healthy meal". Note large portion sizes.

Another member of the team suggested we need to provide clear prompts for the AI software – instructing it to create an image with smaller portion sizes. This was a common theme with the software: whichever method is used to generate the images – voice, text or image upload – you need to be quite clear and detailed about what it is that you require. Misunderstood words or keywords out of context could result in strange images being generated.

Everyone seemed delighted with the image results produced from interacting with the AI, despite some of the unusual and often unintentional consequences of the AI software, such as missing or additional limbs, or odd shaped physical objects (e.g. the dumbbells below).

Image generated using the prompt "Eating chicken curry with weights"

Keeping prompts simple and clear seemed to create straightforward, if a little ordinary, images. Even an obscure prompt such as “Generates an image of not eating pizza, but eating, healthy and small portions with lots of vegetables” produced something realistic, though not particularly out of the ordinary.

Image generated using the prompt "Generates an image of not eating pizza, but eating, healthy and small portions with lots of vegetables".

Some of the participants attempted to generate cartoon strips to illustrate health and wellbeing, but this required quite a lot of effort, and unless very clear and detailed instructions were provided, the results were rather simplistic, as illustrated below. As with many of the other images generated, these tended to depict white, physically fit, young people. To be really inclusive and include images of people with disabilities, you would need to ask for this specifically.

Image generated using the prompt "Create a cartoon strip with someone doing arm exercises". Note the missing legs and additional limbs.

It was later suggested in a discussion following the workshop that visual prompts, such as physical or digital flashcards, could assist in giving the participants the cues that they were looking for in generating the images that they wanted to produce. To a small extent, some iconic prompts (image thumbnails) were provided by the NightCafé interface for selecting the styles of image that could be generated.

Posted in Rix Inclusive Research

Talking Ruler (AKA RixTalk)

At a recent visit to the Google Accessibility Discovery Centre with colleagues from Rix Inclusive Research, amongst the many accessible items on display we saw a ruler that is available for people with sight impairments (they called it a ‘Braille Ruler’). The RNIB also sell an identical ruler to the one we saw at Google, but they call it a ‘tactile ruler’. Both have cutaways every 5mm to help you make measurements, and raised numbers and lines at intervals of 5mm. Here is the one from the Google Accessibility Centre:

Here is the same ruler from RNIB (looks 3D printed to me):

Not being content with a static ruler, I thought it might be interesting to enhance this a little and add some speech to the ruler. So I created an alternative version. The prototype below is the work in progress, which combines the tactile ruler with speech output using (for the prototype at least) an Arduino Uno, with an Adafruit MP3 player for the speech output:

The RixTalk Ruler has similar notches every 5mm, but larger notches every 10mm to help distinguish between them. The ruler is also only 15cm long, as it was easier to print the prototypes this way (having to produce many versions before I got the dimensions correct). Inside each ‘V’ slot is a contact wire, each of which connects to an analog port on an Arduino Uno. The contact wires are visible in the side view below:

The pencil (shown above) has a 3D printed cap that makes contact with the graphite core of the pencil, and is connected to the 5V port of the Arduino Uno. When a slot in the ruler is touched with the pencil, it completes the circuit to the Uno, which then speaks the measurement through the MP3 player and speaker. The prototype uses wires to the Arduino (just five of the thirty slots are connected, for testing purposes), which is a bit awkward, but necessary to test the idea out. However, it could be made into a single unit: battery operated, with an integral speaker and a Bluetooth connection to the pencil. That will be the next development if we decide to continue with it.

Video of the prototype is below:

The next stage of development is to embed the electronics inside the ruler itself, with an integrated battery and speaker, and to remove the need for wires by using a Bluetooth connection to the pencil. We have also looked at the prospect of adding a thin base to the ruler so that, in addition to having the slots at 5mm intervals, you can also rule straight lines. I’ll add an image of the new version as soon as it’s printed.

Posted in Uncategorized

Oil Delivery from Rix

Took a photo of this today, as I liked what it said on the side of the lorry:

Posted in Uncategorized

3D Printer Overhaul

The 3D printer that I originally bought around 2014 was a Prusa i3 MK2. I chose the self-build kit, as I wanted to know how the printer was constructed, and the instructions provided by Prusa are superb. The MK2 was upgraded to a MK2.5 and then a MK2.5S. However, it recently began throwing up all sorts of errors, including MINTEMPs and thermal runaways. I figured that it could be any of a number of sensor faults, but when inspecting it and moving it to the workbench, I managed to break some of the plastic parts. This was probably due to the age of the machine (though it could have been partly down to my clumsy handling).

As I have had it for about 10 years, and still need to do quite a bit of 3D printing at home, I decided to buy a new MK4S so that I could continue with my 3D printed projects (and also reprint the plastic parts for the MK2.5S). The MK4S is a huge upgrade from the MK2.5S and has some extremely nice facilities, including network capability.

With the MK4S up and running, I thought it would be nice to overhaul the MK2.5S and get it back to a serviceable condition (more print time). So, I stripped it down and cleaned the frame and hardware. I also printed new plastic parts for it using the MK4S. Here it is in its current state:

The plastic parts printed really well, but I’m not happy with the bearings (which are the old ones from before I upgraded to the MK2.5). So I have ordered a new set this week and will replace all of them.

I’m about halfway through the rebuild now, and it is coming along well; everything appears to be fitting together as it should. The frame structure is now together, with new bearings for the Y-axis. There were some issues getting the frame square and level, and it required a lot of adjustment before the carriage would move smoothly:

The Z-axis has also been installed with the bars for the X-carriage, and now it just needs mounting on the base. Once this is completed, the extruder and wiring can be installed:

My MK2.5S was finished just before Christmas 2024, after re-printing all the plastic parts (on my MK4S) and replacing the bearings, belts, bed sensor, PINDA, hot end sensor and print plate. When it was first (re)assembled, the self-diagnostic reported that the frame was spot on – it had never done that before (it was always just slightly out of alignment). It runs smoother than before, but still has a slight issue with stringing the filament, which I am working on. I even upgraded the filament support to a MK3/4 version – so much easier to use, and it includes a filament guide.

(The bass on the left is my vintage Kramer DMZ4001, which I have had from new, having bought it in 1982. It has been modified, replacing the pickups with a Schaller split neck pickup and adding a Schaller bridge pickup. It weighs a ton.)

Posted in Uncategorized

Lighthouse Project

A recent commission was for a garden lighthouse. It had to be tall, and light up automatically when dark. This is the result. It is made (mostly) from 3mm ply, which I hand-sawed and assembled in my shed. It was first designed using a 3D modelling application, and each piece was then converted to a 2D design so that some of the smaller parts (e.g. the windows and door) could be laser cut. The roof is also made from 3mm ply, covered in copper foil; with time it should age nicely and turn green. The whole thing stands at just over 1m tall.

The lighthouse contains a very simple circuit to control the light (a single led inside the top part), a photoresistor and a solar cell to recharge the three AA batteries inside. The solar cell and photoresistor can be seen below, and on the far right you can just about see the little circuit board:

The whole thing took about 3 days to construct, then another few days to paint. I used exterior gloss for the white bits (after having first primed the wood) and then I sprayed the red with acrylic enamel.

If you would like one (similar or new design), then please contact me for details. Keep in mind they take a while to construct!

Posted in Uncategorized

Update of Interactive Cow for MERL

Last week we added the ‘smell machine’ to the cow. This device is custom made for the cow, and circulates a single smell placed in a container inside the machine. The smell machine was 3D printed in parts and then assembled, rather like a model kit. You can see the container for the smell in the main image below – it’s the square orange thing sticking out of the side. When you remove it completely it opens up so that you can place a smell inside. It was designed so that a standard sized cotton wool pad could be soaked in a smell and placed in it.

The smell is circulated by a large computer fan (the big black thing), chosen for its fairly compact size but high air volume. When we tested it, the fan would move across the table by itself on full power. There are also two servos on the smell machine which open and close a pair of butterfly valves. These are intended to keep the smell inside the machine when the fan is not blowing – otherwise the smell will just drift out by itself …

To control the smell machine, there is a small wooden control box containing a Pololu Micro Maestro servo controller and a small circuit board with a MOSFET to switch the fan motor on and off. The Micro Maestro was chosen because it can be programmed to respond to input – it runs a simple script which waits for the button press, reads the rotary control for the number of seconds, drives the servos and sends the switching signal to the MOSFET. I can highly recommend them, and the scripting language is easy (a bit like FORTH, if you have ever used that).

On the front panel there is a power-on light, a green push button and a rotary control to select the number of seconds for the fan to blow. The operation is simple: select the number of seconds for the fan, then press the button. The valves in the smell machine will open, the fan will blow for the set number of seconds, and then the valves will close.
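
The Maestro script essentially steps through the sequence just described. As a sketch of that control flow (the real thing is a Maestro script, and these action names are invented):

```python
def smell_machine_sequence(seconds: int) -> list[tuple[str, int]]:
    """The control sequence triggered by one press of the green button.

    A sketch of what the Maestro script does; the action names are
    invented for illustration.
    """
    return [
        ("open_valves", 0),    # servos swing both butterfly valves open
        ("fan_on", 0),         # MOSFET gate goes high -> fan starts
        ("wait", seconds),     # blow for the time set on the rotary control
        ("fan_off", 0),        # MOSFET gate goes low -> fan stops
        ("close_valves", 0),   # servos seal the smell back inside
    ]

for step in smell_machine_sequence(5):
    print(step)
```

Keeping the valve moves strictly outside the fan-on period is the important detail – it means the smell only escapes while the fan is deliberately blowing.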

Whilst the cow was resting on the sofa (see below), we added the smell machine underneath (not shown here) and also tied up all the loose wires, of which there are many. The next job is to add a mains power connector and an on/off switch.

Next week we will be trying out the cow with a group of students from Reading College, so I hope all goes well.

Posted in Uncategorized

Motorised Route Roller

A project I have been working on lately is motorising a ‘route roller’, a device that allows you to carry and view an A4 route sheet on your classic motorcycle. Here is an image of the version sold by the VMCC:


I found the device a little clumsy to use, because you often have to let go of the handlebar and direct your attention to twiddling the knob at the side to scroll the map. I figured that there had to be a better way which would be quicker and would be less distracting, so decided to motorise one.

The idea was to build an extension to the existing route roller, so that you could just add the motor unit with little modification. This first version uses two brackets laser cut from 3mm acrylic, two 3D printed plugs that fit into the ends of the rollers and hold the two axles, three 3D printed spur gears, and a 3D printed cover. The unit is driven by a cheap stepper motor controlled by an ATTiny microcontroller and a ULN2003 Darlington array.

Here are some photos of the unit:

 

Here’s a (pretty crappy) video of the unit working:

I’ll make the plans and circuit available on this blog when I have perfected the device a bit more and given it a good road test. But if you are interested in making one for yourself and trying it out, then email me and I can send you the necessary details.

Here is the Fritzing circuit for the motorised route roller:


I used an ATTiny85 microcontroller to send the signals from the switches to the ULN2003A (Darlington array), not because it was easier than creating a simple transistor circuit (it wasn’t), but because I have a box full of them and they need using! Anyway, it’s kinda nice to use an ATTiny, as it opens up possibilities for mods later on (e.g. tap/hold the buttons to change speed).
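
That tap/hold modification would only require the ATTiny to time each button press. A sketch of the decision logic (the threshold and the three speed levels are invented, and it’s shown in Python rather than Arduino C for brevity):

```python
TAP_MAX_MS = 250   # presses shorter than this count as a tap, not a hold

def scroll_speed(press_ms: int, current_speed: int) -> int:
    """Tap = bump the scroll speed one level (wrapping around 1..3);
    hold = keep scrolling at the current speed."""
    if press_ms < TAP_MAX_MS:
        return current_speed % 3 + 1
    return current_speed

print(scroll_speed(100, 1))  # a quick tap steps speed 1 -> 2
```

On the ATTiny this would just mean recording `millis()` on the press and comparing it on release.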

The components needed for this crude (but simple) design are:

  • 2 x 12mm push buttons
  • 1 x ATTiny85 microcontroller
  • 2 x 10uF capacitors (for the power supply – a bit overkill)
  • 1 x 5V voltage regulator (to power the ATTiny)
  • 2 x 10K resistors, 1/4 W
  • 1 x ULN2003A Darlington array
  • 1 x 12V stepper motor
  • Wire of various colours
  • Stripboard

Note: the stripboard shown in the Fritzing circuit is just for illustration; I actually used less than half of the full 9 x 25 board so that it would fit in the small space available.

The stepper motor I originally used was a cheapo 5V one (the one you can see working in the video), but it was very underpowered and would regularly jam and stop working. So I upgraded to a cheapo 12V one, which was slightly better, but not perfect. See below:


The ATTiny was programmed using an Arduino Uno as a programmer. The Arduino code for the ATTiny is here:

/*
 * ULN2003A driver inputs, as wired to the ATTiny85
 * (passed to the Stepper library in the order IN1-IN3-IN2-IN4):
 * IN1 = pin 2
 * IN2 = pin 4
 * IN3 = pin 3
 * IN4 = pin 0
 */


#include <Stepper.h>

/*-----( Declare Constants, Pin Numbers )-----*/
//---( Number of steps per revolution of INTERNAL motor in 4-step mode )---
#define STEPS_PER_MOTOR_REVOLUTION 32

//---( Steps per OUTPUT SHAFT of gear reduction )---
#define STEPS_PER_OUTPUT_REVOLUTION (32 * 64)  // = 2048 with the gear reduction

// Button for scrolling up and down
#define UPBTN 1
#define DOWNBTN 5

//The pin connections need to be 4 pins connected
// to Motor Driver In1, In2, In3, In4 and then the pins entered
// here in the sequence 1-3-2-4 for proper sequencing
Stepper small_stepper(STEPS_PER_MOTOR_REVOLUTION, 2, 3, 4, 0);

int Steps2Take;

void setup() {
  pinMode(UPBTN, INPUT);
  pinMode(DOWNBTN, INPUT);
  small_stepper.setSpeed(700);
}

void loop() {

  // Read the state of the up and down buttons
  int up = digitalRead(UPBTN);
  int down = digitalRead(DOWNBTN);

  if (up == 1 && down == 0) {
    small_stepper.step(STEPS_PER_MOTOR_REVOLUTION);
  }

  if (down == 1 && up == 0){
    small_stepper.step(-STEPS_PER_MOTOR_REVOLUTION);
  }

}



 

I will have a look around for the mechanical drawings. I have done many different designs, using lots of different stepper motors and plain electric motors, so it will take a bit of time to locate the one above. I later upgraded the design to use a plain geared 12V electric motor, as it was more powerful and simpler to design around.

 

 

Posted in Uncategorized

Interactive Cow for MERL

The latest project I am working on with Kate is an interactive cow for MERL. As part of their rehang, they commissioned the building of an interactive cow for visitors to use. The cow will be capable of recording and playing back sounds, producing smells on demand, and will also have removable textured ‘skins’ that can be attached to it.

The full-sized Friesian cow arrived about 3 weeks ago, but we have yet to name her:


Perhaps we should hold a competition for a name?


Last week we started construction by cutting out part of the base that the cow will stand on. We also had to make a small hole in the belly, both to insert the magnets that hold the removable textures and to fit the electronics when they are ready. We started by lying her on the sofa – perfect for minor operations:


Using a Dremel and a small cutting wheel, we cut out a small section – just large enough to get one’s arm inside! When the cow is upright you cannot see the hole.


The next job to do is to finish the base by adding braces, wheels and the sides, covering with faux grass and bolting on the cow. Once we have her rolling, then we’ll add some electronics.

Posted in Uncategorized