On Demand | Integrating Behavioral and Physiological Data

Integrating The Observer XT and AcqKnowledge


0:15

We have a great speaker panel for you today. We’re looking forward to sharing these technologies, and the integration.

1:21

I'm really excited to introduce Alex Dimov … and Patrick Zimmerman; Patrick is from Noldus and Alex is from BIOPAC, of course. Patrick is a biologist and a trainer and research consultant in behavior at Noldus IT, and he has been with them for over 15 years. He has extensive experience in the study of human behavior and psychology. Welcome, Patrick.

1:47

And Alex is a product expert here at BIOPAC and is also Head of Sales for Europe. He has conducted over 100 seminars and hands-on workshops for BIOPAC over the years. Welcome, Alex!

2:01

All right, let’s start with you, Alex. I’m making you presenter.

2:18

I want to give you a taste of what we’re going to be covering today. So we’ll show all the details during the course of the webinar.

2:27

But since we’re talking about integration between The Observer and AcqKnowledge, let’s just go and demonstrate that.

2:34

So right now I'm connected to physiological sensors. I have a wireless unit right here that records electrodermal activity. From The Observer on the left,

2:46

I initiate the recording in AcqKnowledge here on the right.

2:51

Both are set up on the same computer at the moment.

2:55

And that's just the Noldus time code.

2:58

Here at the top, we have the data from a time code signal that's coming from The Observer.

3:09

So, that allows us to synchronize the systems very well. And on the bottom we have the signal from the electrodermal activity. So let's maybe scale it a little bit, that way. And I'll go ahead and take a deep breath.

3:29

And let go.

3:30

So, now we can see a fairly large skin conductance response here, and we can enter some markers.

3:40

And when we’re done, we can finish the experiment.

3:42

And the data will go back to The Observer. That's just a very quick overview of what's going to happen today and what we're going to show you how to do, so bear with us as we explain how this all can happen.

4:01

And now, let’s switch over to Patrick.

4:09

Thanks, Alex.

4:11

There you go. Thank you very much, Brenda.

4:18

So, I think it's good to start by explaining a bit more about why you would want to acquire behavioral and physiological data, because that's the topic of this webinar.

4:33

Measuring and studying behavioral and physiological data is an objective way to indirectly get access to someone’s psychological state.

4:44

So, measuring different types of signals, on the outside by looking at the behavior, but also on the inside by looking at the physiology, gives you more insight into a participant's response to the test stimuli or the test environments you expose them to.

5:02

And depending on the type of test, participants do not always show a clear behavioral response. A behavioral response could be a change in body language or facial expression, for example.

5:15

Therefore, it's good to also include measuring physiological signals.

5:23

Of course, verbal reports or questionnaires give you direct access to the response of the participants, but these are subjective and colored by what participants think you want to hear, and often participants don't remember all the things they experienced during a test.

5:42

So it's good to be able, during a test, to record the behavior and also the physiological responses that become visible at the moment that stimuli are presented.

5:58

So, behavioral and physiological measurements show you how participants respond during the tests from one stimulus to the next.

6:06

So, that was a short introduction about why you would acquire physiological and behavioral responses. Back to you, Brenda.

7:33

Most people joined because of you guys, the speakers, so, awesome, thank you all for participating. Back to you, Patrick.

7:42

Thank you, Brenda. So, let’s proceed with showing you how to set up your system.

7:49

So this schematic drawing shows an example of how the two systems can be connected.

7:58

So, on one computer, we run The Observer and the …

8:04

server. The N-Linx server and agent are responsible for connecting the two systems, The Observer XT and BIOPAC AcqKnowledge.

8:17

So by having the N-Linx server on The Observer XT computer, and the N-Linx agent and the AcqKnowledge control program on the AcqKnowledge computer, we enable The Observer to communicate with AcqKnowledge and start and stop data acquisition.

8:36

As Alex already pointed out, he currently has The Observer XT and AcqKnowledge running on one computer, which is possible.

8:45

It's also possible to run them on separate computers, connected through a local network.

 

8:54

So, by setting up your system this way, the …

8:59

protocol allows you to start and stop data acquisition in AcqKnowledge from The Observer XT. And optionally, as Alex already demonstrated, you can send a time-coded synchronization signal from the serial port of The Observer computer to the MP160 system, to allow for even more accurate synchronization of behavioral and physiological data, down to one millisecond.
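For readers who want a feel for what that time-coded signal buys you, here is a minimal Python sketch, under the assumption that the sync channel recorded in AcqKnowledge is a pulse train that begins when The Observer starts. The sample rate, voltages, and pulse spacing below are made up for illustration and are not taken from the protocol itself.

```python
import numpy as np

fs = 1000.0                                   # assumed acquisition sample rate, Hz
sync = np.zeros(int(fs * 10))                 # stand-in sync channel, 10 s long
sync[int(fs * 2.0)::int(fs * 0.5)] = 5.0      # hypothetical pulses starting 2.0 s in

# Offset estimate: the first rising edge of the pulse train (2.5 V threshold here)
edges = np.where((sync[:-1] < 2.5) & (sync[1:] >= 2.5))[0]
if edges.size:
    print(f"estimated start offset: {edges[0] / fs:.3f} s")   # ~2.0 s in this synthetic example
```

This only illustrates the idea of finding a common zero point from the recorded sync channel; the actual alignment is handled by the software during import.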

9:30

And after data acquisition, the raw AcqKnowledge data is imported automatically into The Observer XT.

9:41

So what is the procedure for working with the two programs and setting up the connection between the two? First, you start The Observer … and AcqKnowledge.

9:54

And if you have already installed the N-Linx server and agent, they are started automatically when you start the computer. So you don't have to worry about that.

10:05

Then, the next step is to set up The Observer XT, to connect to AcqKnowledge, which I will show you in a minute.

10:11

And after that, Alex will show you how to set up AcqKnowledge to be able to receive the trigger from The Observer and the time-coded signal.

10:26

So, let me switch to The Observer XT, which is here.

10:31

I’m going to start a new project, Observer XT, AcqKnowledge demo.

10:40

OK, so, first what I want to show you is: if you start The Observer, and you have installed the N-Linx server and agent and want to activate it for the first time, you go in The Observer to File, Preferences, N-Linx settings, and make sure that you have this option selected.

11:00

So I can test the connection, it’s connected, that’s all fine.

11:04

And now I can proceed to the project setup to specifically connect to BIOPAC AcqKnowledge.

11:13

OK, I go to live observation.

11:15

I’m going to do a live test.

11:21

And I select BIOPAC AcqKnowledge, proxy enabled, and enabled and running. So that's important: it should be enabled and running.

11:30

And if you want to send a time-coded signal to AcqKnowledge, you make sure you select a COM port that's currently active.

11:42

And I selected the time code signal, and you make sure that the sample rate that you put in here

11:52

is the same as the acquisition sample rate set in AcqKnowledge, OK.

12:00

Well, I don’t have a signal connected right now.

12:04

OK.

12:07

And before I start a test,

12:11

I can optionally already define some markers in my coding scheme in The Observer.

12:17

So, this is where you can enter markers, events that you want to score during the test, or you can first acquire all your data and then, afterwards, do your manual coding.

12:32

So, let me just give you an example of how I can add a behavior group, for example if I want to

12:40

manually code different phases in my test.

12:44

So I click add behavior group.

12:46

The group name is

12:48

'Phases', and I want to code the different phases during my test: phase one, phase two, phase three. And you can see here that these all have a duration, so I will be able to calculate the frequency and duration of each phase.

13:11

And if I want to add another, let's say just a marker with no duration, I select 'No', and then I will be able to just count the frequency of the marker that I manually coded.

13:28

OK, well, now we're ready, on The Observer side, to connect to AcqKnowledge, and Alex will now take over to show you how things are set up on the AcqKnowledge side.

13:46

So, back to you, Alex.

13:50

OK, thanks Patrick.

13:52

So, we are back in AcqKnowledge, and we're going to go from scratch.

13:59

We're just starting up the application, and I'll show you the steps for setting up AcqKnowledge from the very beginning. First of all, before we even do that,

14:09

I need to instrument myself, and in this case I've placed two electrodes for electrodermal activity, these electrodes right here,

14:19

connected to the BN-PPGED; that's the wireless transmitter that can record both pulse and electrodermal activity.

14:29

Very straightforward. I could have also used the middle phalanges of the fingers; that's a typical placement site as well.

14:37

So, now that the participant is set up, we’ll go ahead and create a new experiment, and typically, you would just use a template. I have already created a template, but let’s just do this from the beginning.

14:53

Now, you get this screen, which gives you options for smart amplifiers. These are the newer generation of wired amplifiers.

15:02

We can also have analog channels, which include this module right here, the wireless module, and then we have digital and calculation channels.

15:11

So, first of all, I want to make sure that I'm recording the synchronization pulses that will be coming from The Observer, and I'll just go ahead and enable channel two, because I have connected this to channel two already.

15:28

And we can see here how this is physically set up. From the USB port of the computer, I'm going to a serial adapter, with a cable that goes to the input isolation adapter, which is very important for safety reasons (we don't want any electrical connection between our system and anything else), and then it feeds into channel two

15:54

on the AMI100D module. And here, all I'm saying is: look, we have something connected to channel two, and we're going to call this our sync channel.

16:07

Then I also have the pulse and EDA module, and I can use a dialog to bring that in.

16:16

We just look for the product code, it’s PPGED-R.

16:22

and I have to make sure that the settings here match the physical settings on the amplifier. So I'm just looking at my amplifier right now (sorry, my receiver); I'm not using the pulse channel, and this is how I have set up my … channel.

16:39

So now the software will know what to do with the signal, and we click OK. It's prompting me to check that it's working, it's working, and there is an initial calibration. That's a zero-point calibration: it ensures that if there is any offset in the measurement, it is taken care of.

16:59

I will actually disconnect the leads like this, disconnecting them from the electrodes, and I'll click on Calibrate.

17:07

And now we’ve done that. We continue.

17:11

I’ll connect myself again.

17:14

Give me a second.

17:16

You don't have to do this every time; in fact, the equipment is quite stable over time. So, once you do this, you can probably go for weeks on end without doing it again. So I can come over here in the software, where we have this green checkbox,

17:33

and I will no longer have this calibration procedure performed at the beginning. That's it.

17:41

OK, well, now I can save this as a graph template, and you can see, I’ve already been saving a bunch of these, so this will be the fourth one.

17:53

When you save it as a graph template, next time you want to do this, you can just double click that template file, and you’re good to go.

18:01

So you don't have to do anything like what we're doing right now. And one more thing: here in this setup, we can also control the acquisition sample rate.

18:12

As Patrick noted, it's important to know the acquisition sample rate that's used in AcqKnowledge.

18:19

This is so that the time code signal can be sampled appropriately, OK.

18:25

So here we make sure that it is what we want it to be. If we want to, we can also make some configurations in our own event marking system, which can be useful for any sort of live interventions or events that occur in AcqKnowledge. OK, so all of this is set up now, and there is one more thing we want to do on the AcqKnowledge side: I'll click here to bring up the Preferences, and you want to do this before you even begin.

18:59

Go to the Networking tab and make sure that network data transfer is enabled. Now, network data transfer is a licensed feature, so if you haven't purchased it at the time when you acquire your BIOPAC system, it is something you can additionally purchase later, and then it will become active.

19:20

Once you enable it, it will become active once you restart the software.

19:25

You can see what license features you have in the software.

19:29

If you go to Help, about, AcqKnowledge, and click on License Features.

19:35

I have a lot of them, and network data transfer, right here, is one of them.

19:42

OK, well, at this point, we’re good to go, and let’s jump back to The Observer, and we’ll start a new measurement.

19:54

I'm essentially continuing from where Patrick left off in his setup; we're running exactly identical settings here. So, we set up a new measurement now, and we just initiated it from The Observer.

20:11

I will Alt-Tab here to AcqKnowledge, OK, and change the timescale a little bit.

20:22

And also change the scaling on the channel: make sure the EDA channel is set to adaptive scaling.

20:30

This is very useful because with adaptive scaling we can say: show no less than one microsiemens, but then automatically zoom in on the data.

20:40

And this just makes it easy to follow what’s going on. So I will go ahead and take a deep breath again.

20:50

And we get a very big response here; that just causes a very strong reaction.

20:56

So yeah, at that point we can also be entering some markers, like I mentioned before, by using the function keys.

21:08

There it is, and I'm getting a warning that my network is slow, so I'm going to turn off the webcam.

21:18

So, hopefully, everybody’s able to see what I’m doing. All right, so.

21:30

OK, and then from The Observer, we'll go ahead and stop the observation.

21:40

And now we will see data come into The Observer. It will take a minute and then we will see the data appear right here.

21:48

So we have two channels, just like they looked in AcqKnowledge.

21:53

We have the time code channel, and we have the data from the EDA.

22:17

And then, while the poll question is going: Steve had a question about whether this will work with the MP36 or the MP36R.

22:29

Yes, there is a way to do that with the MP36R as well. So if you want to feed in a time sync signal, you can also connect it to the MP36R; it just requires an adapter that we sell for that as well.

22:47

So it totally can be done.

22:49

And, yeah, OK. So it can be done, but it requires a slightly different setup. It is all just a little bit different, but I think we do have users that are using it in that context; it's just a very minor difference in how we set it up.

23:27

Yeah, let's see. Are we going to Patrick now?

23:37

Well, thanks, Alex, for showing the integration. What I want to show you now is the prerecorded data, or the project, that Alex already mentioned.

23:49

So, let me switch to The Observer.

23:53

So, what we're looking at here is The Observer, with the visualization of the test, the observation we did. And what we did here: I watched a trailer on the monitor here, the trailer of Dune,

24:11

while, as you can see here (it's a bit small), I was equipped with two EDA electrodes and a PPG sensor to measure the pulse in my finger.

24:27

And besides physiological data, we also recorded video.

24:33

So we captured the screen that I was looking at, also using eye tracking, and we had two cameras: one overview camera and one camera capturing my face, based on which we could analyze facial expressions.

24:53

So, both the video recording software (that's one of our tools, called Media Recorder) and BIOPAC AcqKnowledge were triggered through the N-Linx protocol that we already explained.

25:10

So, when I clicked Start Observation in The Observer, BIOPAC AcqKnowledge started acquiring the EDA and the PPG signal.

25:19

The cameras started recording.

25:22

The screen recording started.

25:25

And I started watching the trailer on the screen.

25:30

So once the trailer had finished, we stopped the recording. And then the BIOPAC data was automatically imported.

25:39

So we have the raw, tonic EDA signal here.

25:43

We have the calculated heart rate signal, which was calculated in real time using a calculation channel in AcqKnowledge and was imported as heart rate.

25:57

And after all this data was imported, I did some manual coding. I already showed the behavior group with the four phases in the trailer, and I manually coded them, so we have the four consecutive phases here.

26:15

And I also manually coded certain specific scenes, like the scenes showing some fighting and some love scenes.

26:24

And I also added some markers for jump-scare moments, where there was a sudden sound or change in the scene. There were a couple of those

26:41

in the trailer.

26:43

So, these are the manually coded events, and if you look closely, if I zoom in on the physiological data, you can see that when I move the physiological data on the timeline, everything moves on the same timeline.

27:01

So, if I play the videos, then everything will play synchronously and simultaneously. And what we see here at the start: there was a two-second delay, or offset, between the start of The Observer and the start of AcqKnowledge.

27:19

So, even if we have an offset, using the …

27:24

protocol and the time code signal ensures that the physiological data, after import into The Observer, is put exactly at the right spot, synchronous with the start; you know, the start of The Observer and the start of AcqKnowledge are synchronized. So, even if there is a delay, this is taken into account when the data is imported into The Observer.

27:54

So, having both the physiological and the behavioral data in one project

28:05

allows us not only to look at the behavior, but to analyze it in even more detail.

28:13

So, if I go to behavior analysis, for example, I can (if I pull up the mean duration and the total duration as well)

28:24

very accurately display and analyze the different phases: how many times they occurred and the total duration of each phase.

28:36

Also for the fighting and the love scenes, I can see how many times they occurred, how many times they were coded, the total duration, and the mean duration of each behavior or event that was manually coded.

28:56

What we can also do, if we want to, is look at the combination of behaviors, events, and physiological data.

29:06

For example, if I want to analyze the mean value of the tonic EDA signal, or the mean heart rate in the raw heart rate signal imported from AcqKnowledge,

29:23

I can analyze them in different segments, for example corresponding to the different phases that I coded.

29:32

So, let us say that I want to know the average EDA value and the average heart rate during each of these four phases.

29:42

So, I can do that very easily in a data profile in The Observer.

29:47

So, if I open it (I've already made a selection), I create segments, or intervals as they are called in The Observer, for each phase separately, so I kind of separate my data into four different segments.

30:05

If I now go to Visualization, you can very clearly see what the result is.

30:14

So you can see here that the data is now segmented into four intervals, and everything here that is greyed out or blued out is not taken into account in the visualization and the analysis; only the parts corresponding to the four different phases are now selected.

30:38

So if I now go to the numerical analysis, because that's where you need to go if you want to do some calculations on the physiological signals, you can see that I show the results for the maximum value and the mean value for EDA and heart rate.

30:59

And these are now calculated for the four phases separately.
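As a rough illustration of what that per-phase calculation amounts to, here is a small Python/pandas sketch. The timeline, signals, and interval boundaries are hypothetical stand-ins, not the values from this project.

```python
import numpy as np
import pandas as pd

fs = 62.5
t = np.arange(0, 240, 1 / fs)                            # stand-in 4-minute timeline
eda = 11 - 0.005 * t + 0.1 * np.random.randn(t.size)     # hypothetical tonic EDA trace
hr = 85 + 2 * np.random.randn(t.size)                    # hypothetical heart rate trace

phases = {"Phase 1": (0, 60), "Phase 2": (60, 120),
          "Phase 3": (120, 180), "Phase 4": (180, 240)}  # hypothetical coded intervals, seconds

rows = []
for name, (start, end) in phases.items():
    sel = (t >= start) & (t < end)                        # samples falling inside the interval
    rows.append({"phase": name,
                 "EDA mean": eda[sel].mean(), "EDA max": eda[sel].max(),
                 "HR mean": hr[sel].mean(),   "HR max": hr[sel].max()})
print(pd.DataFrame(rows).round(2))
```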

31:04

So, we can see here that on average the EDA in phase 1 was 10.8, and that it gradually decreased over time.

31:15

Whereas the heart rate was quite high for a resting heart rate; I think this is a result of the trailer being quite exciting.

31:26

We can see that in phase three, the average heart rate was higher: two beats per minute higher than in any other phase. And I think this is the result of some jump-scare moments in phase three.

31:41

Well, this is just a quick overview of what you can do when you have a project in which you have integrated and synchronized behavioral and physiological data,

31:56

in this case, physiological data from BIOPAC AcqKnowledge. If you want to know more about The Observer and how to work with it, we have a number of webinars on our YouTube channel that are worth checking out.

32:28

And while we have the poll going, I do have a quick question. I'm not sure which one of you this is for, but Madison asks: do the event markers in AcqKnowledge get imported into The Observer at the end of the data collection?

32:42

So, I'll go ahead and answer that. You can explicitly export these markers, and then they can be imported into The Observer.

32:54

OK, so, after you're finished with the recording, you can export them as a file and then import them into The Observer; they don't get broadcast over the network, if that's the question.

33:35

We have an almost equal number of people responding with ECG, EDA, pulse, and respiration (pulse is really high), and then a slightly smaller group of people doing facial EMG.

33:49

So thank you, and Alex, I believe you are up now.

33:54

Yes.

34:00

Webcam off, right, so just keeping the webcam off, just in case. All right, thanks Patrick and thanks Brenda. So now we're going to go to the other side and look at what sort of analysis we can perform in AcqKnowledge,

34:18

using the markers that have already been configured and recorded in The Observer. The data file that I'm looking at was sent to me by Patrick; we're working on exactly the same data set.

34:32

I’m just going to turn on some tools here so I can visualize data a bit better.

34:36

So we have pulse on the top, electrodermal activity, the time code signal, and an online calculation of pulse rate.

34:47

So first of all, I will perform a little bit of filtering and data cleanup, even though this is extremely clean data, just so we can go through the motions of what we would usually do.

35:02

Starting with the pulse data, as it says here on the top, we'll go ahead and apply a digital filter:

35:11

an FIR band pass between 0.5 and 3 hertz.

35:20

Since in this case we are only interested in finding out the pulse rate, we just want to remove any of the slow frequencies in the data that could be due to, for example, respiratory activity. And then also, we're cleaning up anything above three hertz, which, in this context, is essentially noise. What happens is that the signal ends up centered around zero.

35:44

That makes it very easy to detect each pulse cycle, so we can obtain the heart rate without any trouble by using a fixed threshold. In fact, let's do that.

35:58

We'll go to Analysis and then Find Rate.

36:04

And we’ll just use a fixed threshold at zero and now we have heart rate as well.
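For anyone curious what that filter-then-threshold approach looks like in code, here is a minimal Python sketch of the same idea on synthetic data. It is not AcqKnowledge's implementation, just the band-pass plus zero-crossing logic described above.

```python
import numpy as np
from scipy.signal import firwin, filtfilt

fs = 500.0                                   # assumed sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
pulse = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)  # stand-in pulse signal

# FIR band pass 0.5-3 Hz: removes slow drift and high-frequency noise
taps = firwin(numtaps=501, cutoff=[0.5, 3.0], pass_zero=False, fs=fs)
filtered = filtfilt(taps, [1.0], pulse)      # zero-phase filtering; signal is now centered on zero

# Fixed threshold at zero: a beat is a negative-to-positive crossing
crossings = np.where((filtered[:-1] < 0) & (filtered[1:] >= 0))[0]
ibi = np.diff(crossings) / fs                # inter-beat intervals in seconds
rate_bpm = 60.0 / ibi                        # instantaneous heart rate
print(f"mean rate: {rate_bpm.mean():.1f} BPM")
```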

36:10

Of course, we already had heart rate calculated online, but sometimes you get artifacts during the online calculation,

36:18

and this is a slightly better way of doing it, OK? So let's just hide that channel, and let's hide the pulse channel; we no longer need to look at that. I mean, if you want to see what it looks like, well, that's the pulse signal.

36:32

So hold down the Alt key, click on the channel, and there we have it. In fact, while I'm at it hiding channels, I'll hide the time code signal.

36:43

So now we can concentrate on the EDA. Mousing over the label, I can see that the sample rate is 500 hertz.

36:52

That's what was used for this recording. For electrodermal activity, that's a lot of information; we don't need that sort of sampling rate, so I'll resample it.

37:03

And the reason why I'm resampling it is so that I can speed up any sort of data processing that I will do next.

37:12

If you're at around 50 samples per second or more, in this case 62.5, that's quite sufficient. So we'll click OK.

37:21

OK, and next: if we had any sort of artifacts in here, for instance quick artifacts from the participant shifting their fingers, or maybe picking at the electrodes, or the wire bumping against something, those show up as very sharp spikes in the data with extremely short duration, totally non-physiological.

37:46

And we can go to Transform, Smoothing, and use a median smoothing filter with a window equal to the number of samples in a second; in this case that's 62.5, so I'll use 62.

38:01

Click OK.

38:03

And finally, on top of that, we'll apply a one-hertz low-pass filter. If you're interested in the various subtleties of cleaning up both pulse data and electrodermal activity data,

38:17

we have webinars that discuss this at length. But now, here we have it; it's pretty well cleaned up.
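The same cleanup chain (resample to 62.5 Hz, median-smooth over roughly one second, then a 1 Hz low-pass) can be sketched outside the software as well. This is a minimal Python example on a stand-in trace, with filter orders chosen arbitrarily rather than taken from AcqKnowledge.

```python
import numpy as np
from scipy.signal import resample_poly, medfilt, butter, filtfilt

fs_in, fs_out = 500.0, 62.5
eda_raw = np.random.randn(int(fs_in * 120)).cumsum() * 1e-3 + 10.0   # stand-in 2-minute EDA trace

# 1) Resample 500 Hz -> 62.5 Hz (ratio 1:8); ~50+ samples per second is plenty for EDA
eda = resample_poly(eda_raw, up=1, down=8)

# 2) Median smoothing over about one second (kernel must be odd: 63 samples is roughly 1 s)
eda = medfilt(eda, kernel_size=63)

# 3) 1 Hz low-pass (4th-order Butterworth, zero-phase)
b, a = butter(N=4, Wn=1.0, btype="low", fs=fs_out)
eda_clean = filtfilt(b, a, eda)
```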

38:27

And we’ll do one more thing here. We’ll go ahead and obtain the skin conductance responses.

38:35

So we'll go to the electrodermal activity analysis. And once again, in the preferences there are a lot of options you can change, but we're just going with the defaults at the moment, illustrating the process of locating the skin conductance responses.

38:51

So, taking the channel with my EDA signal, the software will create what's called a phasic signal, which shows the change in the electrodermal activity signal.

39:04

That’s this new channel three, which I will hide.

39:09

And then it marks the onset of each response; for instance, right here, it's illustrated by that open bracket.

39:20

Then we have the peak of the skin conductance response, and then we have the offset. Exactly how these are marked just depends on the settings.

39:28

But at this point we have what we need to proceed to the next step.
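As a rough sketch of what such response marking involves, here is a simple Python example that derives a phasic trace by high-pass filtering and marks onsets and peaks with a fixed amplitude threshold. The cutoff, threshold, and search window are arbitrary assumptions, and AcqKnowledge's actual algorithm and preference settings differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 62.5
eda_clean = 10 + np.cumsum(np.random.randn(int(fs * 120))) * 1e-3   # stand-in tonic EDA trace

# Phasic component: here approximated by high-pass filtering the cleaned EDA at 0.05 Hz
b, a = butter(2, 0.05, btype="high", fs=fs)
phasic = filtfilt(b, a, eda_clean)

threshold = 0.01                        # microsiemens; an assumed minimum SCR amplitude
rising = np.where((phasic[:-1] < threshold) & (phasic[1:] >= threshold))[0]
for onset in rising:
    segment = phasic[onset:onset + int(5 * fs)]        # look up to 5 s ahead for the peak
    peak = onset + int(np.argmax(segment))
    print(f"SCR onset at {onset / fs:6.2f} s, peak at {peak / fs:6.2f} s")
```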

39:34

Which is taking some measurements.

39:37

So, up above here we have the measurement boxes; in AcqKnowledge, that's where all the magic happens, and we're obtaining measurements from the data.

39:47

I want to have six different measurements here.

39:51

So, I will change the settings so that I have two rows and three columns of measurement boxes. So, there they are.

40:01

OK, and I’ll set them up as follows.

40:05

The first one will be simply the Delta T.

40:10

So, that's how much data we have highlighted; just the duration in seconds.

40:20

The next one will be a count of skin conductance responses.

40:26

So, if we go down here to Event Count, we can say: on our measurement channel, let's count skin conductance responses, and make sure that the measurement channel is in fact the correct one, channel nine. So, if I highlight over here, I should have one (right, this one here), and if I do this, I actually have three. So, it's all working.

40:50

Not only that, we can also set up what are called calculation measurements.

40:54

So, I can look at the ratio of how many skin conductance responses we’re getting per unit time.

41:02

So I'm going to take row A, column two, divided by row A, column one. So I'm taking the number of skin conductance responses in a given selection and dividing it by the number of seconds.

41:18

And I'm reporting that as a measurement.

41:21

So, for instance, right here I have four of these divided by the delta T.
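The calculation measurement itself is just a ratio; here is a tiny worked example with a hypothetical selection length.

```python
# SCR count in the highlighted selection divided by the selection duration (delta T)
delta_t = 14.2        # seconds highlighted (hypothetical value)
scr_count = 4         # skin conductance responses counted in that selection
scr_per_second = scr_count / delta_t
print(f"{scr_per_second:.3f} SCRs/s")   # ~0.282 SCRs per second for these numbers
```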

41:29

So, that's as far as skin conductance is concerned. Let's continue and rename this one, by double-clicking on the label, to HR,

41:40

for clarity.

41:41

Change the channel here to the HR channel, and I'll be looking now at measuring the mean rate.

41:52

So, the mean heart rate, like this. And then,

41:57

I would like to know, on the heart rate channel, the standard deviation.

42:04

And finally, since I know from the data that Patrick has generated that there are a lot of events which are specified as flags, I'll count those as well.

42:18

So let's just go to Event Count, and I'm going to be looking at flags; those are called global events. OK, so that's good.

42:30

And now let's bring in the markers. That happens from the Analysis menu, this menu, and then we can import markers from The Observer. And, as one of the questions during the poll asked, we can also export our own markers to The Observer. Additionally, if you're using FaceReader by Noldus, you can also import data from that, and you can also link to the FaceReader video. That's a topic

43:00

that's just separate and lengthy in itself.

43:03

So let’s just go ahead and import markers from The Observer and click Open and boom, there they are.

43:10

So let’s look at what we have here, OK. So we’ll go ahead and open the event palette.

43:18

The event palette shows all the different events that we have here; these are the global events. Let's make it a bit easier to see. We have a bunch of 'selection begin' and 'selection end' markers.

43:34

And these correspond to the different phases. So we have the phase one 'selection begin' right here, and then phase one ends a bit later on. We also have other ones that correspond to love and fight.

43:50

Then we also have the flags that correspond to jump scares.

43:55

This is how data were analyzed in The Observer.

43:59

Well, now that we have these, we're ready to perform the analysis, but there's one final step that we have to do. Patrick has already calculated, based on the time code signal, that we need to correct for a two-second shift in the data.

44:17

And we can go to Transform, Delay, and delay each of our two channels that carry the measurements by that amount.

44:27

And this one as well; we can just use the recently used value.

44:32

So now we've performed a shift here in the data.
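Outside the software, the same correction is just a shift of the channel by the measured offset. A minimal Python sketch, assuming a 2.0 s offset and a 62.5 Hz channel:

```python
import numpy as np

fs = 62.5                                   # samples per second of the (resampled) channel
offset_s = 2.0                              # offset measured from the time code signal (assumed)
shift = int(round(offset_s * fs))           # offset expressed in samples

def delay_channel(x: np.ndarray, n: int) -> np.ndarray:
    """Delay x by n samples, padding the start with the first value."""
    return np.concatenate([np.full(n, x[0]), x[:-n]])

eda_clean = np.random.randn(10_000)         # stand-in for the processed EDA channel
eda_aligned = delay_channel(eda_clean, shift)
```

As the presenter notes later in the session, any event detection on a shifted channel should be run after the shift is applied, not before.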

44:37

OK.

44:41

All right.

44:41

So let's just come back to Analysis, Find Cycle.

44:49

And now what we want to look at is events. We'll look at the different phases, so we can go here and look at 'selection begin'. We want to finish our selection at 'selection end'.

45:07

And we can be specific, we can say, We are looking for phases.

45:11

So, 'Phase'.

45:14

And you can see here the four consecutive phases of the experiment: 1, 2, 3, and 4. And so AcqKnowledge will identify those correctly, and we can output our measurements either to Excel or, in this case, I'll put them into the journal. So, let's just go ahead and click on Find All Cycles, start from the beginning. And we need to create a journal, because we don't have one yet.

45:44

And there we have it; these are our phases. You can see here: phases, and they go 1, 2, 3, 4.

45:54

We can repeat exactly the same process to identify all the love scenes, so they will look like that; you can see where they are.

46:04

And we can find all cycles, starting from the beginning, and just see here: these are the love scenes.

46:13

And, finally, we can have the fight scenes identified. So, let's just do that.

46:22

This time we go with 'Fight'.

46:30

OK, find all cycles from the beginning.

46:34

Let's expand this window a little bit and see what we have here. The first column is delta T, the duration of that phase.

46:43

Then we have how many skin conductance responses there were in the phase.

46:48

The next number here is the ratio of skin conductance responses over time; it carries a negative sign

46:56

simply because of the direction in which the delta T was selected. Then we're looking at the mean heart rate, and then we're also looking at the heart rate

47:06

standard deviation. The final column is showing us how many jump scares we have for that particular phase.

47:14

OK, Patrick will show afterwards how you can import that data into The Observer. I just want to mention a couple more things here. If we want to perform analysis that is specifically about the event points, so, when are we seeing something with respect to the flags, we can go to

47:38

electrodermal activity. There are many different things you can do, but this is one example.

47:44

So, based on the …

47:47

activity, we'll go ahead and look at flags, whose type is global events, and we'll run the analysis again.

47:59

We can do Excel, or we can do Journal; this time we'll do Excel, so we can have a little bit of variety here.

48:10

And we can see that there was one jump-scare marker that, in fact, was followed; sorry, there was one skin conductance response

48:23

just following the jump-scare event.

48:27

None of the other skin conductance responses immediately followed

48:34

the actual jump scares.
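The same event-related check can be expressed very compactly. Here is a minimal Python sketch with hypothetical event and SCR onset times, and an assumed latency window of 1 to 4 seconds after each event; the window actually used by the software depends on its settings.

```python
import numpy as np

jump_scares = np.array([35.2, 78.9, 140.4])       # hypothetical event times, seconds
scr_onsets = np.array([12.1, 36.8, 95.0, 152.3])  # hypothetical SCR onset times, seconds
window = (1.0, 4.0)                                # assumed latency window after each event

for ev in jump_scares:
    hits = scr_onsets[(scr_onsets >= ev + window[0]) & (scr_onsets <= ev + window[1])]
    print(f"event at {ev:5.1f} s -> {hits.size} SCR(s) in the window")
```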

48:35

So this is a useful analysis you can plug into your statistical software afterwards, OK? All of this could even be done as a single-click operation, although we did it from the beginning, by using the BIOPAC scripting language. If you want to learn about that, that's a separate topic, but you can just contact us. And there are all sorts of other analysis procedures that can be automated, like heart rate variability, electromyography, EEG, etcetera.

49:07

Right, well, at this point I'll go ahead and transfer over to Patrick, so he can show you how he can import a set of measurements just like these, which I passed on to him before we started here.

49:29

OK, thank you Alex.

49:32

So, what we prepared is an export file with events, like Alex just demonstrated.

49:43

So, we have a number of average values for different selections,

49:54

the intervals between 'selection begin' and 'selection end'.

49:58

And the only thing that's required is that you add a label.

50:01

So, you add a label to enable proper import into The Observer XT.

50:08

And then there's a time column.

50:11

And there are columns with the rate mean and the rate standard deviation, which can be imported into The Observer as numerical values. So this is a file that I prepared, and I'm now going to show you.
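To make the shape of such a file concrete, here is a small, purely hypothetical Python sketch that writes a tab-separated file with a label, a time column, and the two numeric columns. The column names and values are placeholders, and the exact layout The Observer XT expects is described in its own documentation.

```python
import pandas as pd

rows = [
    {"Time": 12.5,  "Label": "Phase 1", "Rate_Mean": 88.26, "Rate_SD": 2.30},
    {"Time": 74.0,  "Label": "Phase 2", "Rate_Mean": 86.10, "Rate_SD": 1.95},
    {"Time": 135.5, "Label": "Phase 3", "Rate_Mean": 90.40, "Rate_SD": 2.80},
]
# Hypothetical column names; values other than the 88.26 / 2.30 pair are invented for illustration
pd.DataFrame(rows).to_csv("acq_results_for_observer.txt", sep="\t", index=False)
```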

50:23

You can import that into an existing observation.

50:27

So we have our observation here with the automatically imported raw data and the manually coded events.

50:34

So I'm going to click Import Data, Import Observational Data, and I'm going to select the AcqKnowledge export labels and import them as markers.

50:49

And this allows us to do additional analysis of results that were calculated in BIOPAC AcqKnowledge, and I will show you in a second what it looks like in the visualization. So, here we can see that they have been imported as markers.

51:09

So, behavioral data in The Observer, with the rate mean and the rate standard deviation.

51:19

If I go to the visualization, you will see that they also appear in the visualization here.

51:28

So, these are my manually coded data, and here are my results from the analysis in BIOPAC AcqKnowledge. So, we have the phases, the love scenes, and the fight scenes.

51:43

And if I expand, we can see that this result is 88.26 beats per minute, with a standard deviation of 2.3.

51:56

So, exporting the markers to AcqKnowledge, doing analysis on the processed signals, and then importing the results back into The Observer allows you to take full advantage of both the data selection and integration possibilities in The Observer

52:20

and all the analysis possibilities in AcqKnowledge. OK, I'll hand control back for a second, Brenda.

52:34

Uh, thanks. I would like to correct an error that I made a bit earlier in the presentation. The order in which you perform the steps is very important. The event markers here that define the skin conductance events should be calculated after you add the delay to the channel, and what I did was actually apply the delay afterwards, which is not what you want to do. So I've repeated the steps here, OK? It's very easy to overwrite the results.

53:12

All we have to do is go to Electrodermal

53:18

Activity, locate skin conductance responses again, and erase the previous markers that exist there.

53:28

And then you can essentially correct it. So I'm going to pick erase. So now the markers are in the correct place.

53:37

So I’m sorry about that, and hopefully, everybody’s been following it from start to finish, so you can see how this step is needed to make sure that the markers end up in the correct place.

53:48

OK, so I’m finished with this part myself.

53:53

OK, so back to Patrick.

53:57

Well, yes, so I finished The Observer part.

54:00

So I guess now, as we're close to the end of the webinar, at least we've finished the demonstration part,

54:11

I want to tell you a little bit about our company.

54:17

Who are we? We are Noldus Information Technology, founded over 30 years ago by, and still led by, our CEO Lucas Noldus; that's where the name Noldus comes from.

54:31

And, at the moment, we have 160 employees in nine different countries.

54:37

Our headquarters are in Wageningen, which is a small university town in the Netherlands, towards the eastern border. And we have offices in China, and an American office in Leesburg, Virginia.

54:54

We have quite a large customer base, so all the countries here displayed in green are countries where we have customers.

55:05

So, we provide complete solutions for behavioral research in humans.

55:12

As you have seen today, we provide all kinds of systems to study the behavior of humans in any form or shape.

55:23

So, we have behavioral labs, but we can also integrate with eye trackers and with data acquisition systems, such as BIOPAC. So, basically, if you want to study behavior, we have all the tools you need to do that.

55:38

And besides products for human behavior.

55:41

We also have a lot of systems for measuring and quantifying animal behavior, in terms of locomotion and position in different environments. And we also have a couple of systems that quantify the gait of animals.

56:00

So, that is, in a nutshell, an overview of what we do. If you are interested in knowing more, then Brenda will, at the end, show you the website and the e-mail address at which you can contact us. So, back to you, Brenda.

56:18

All right. So now, a few words about … as well.

56:23

Oh, um.

56:25

Over 99% of the top universities run BIOPAC.

56:29

And if you Google on the web, you can see that there are tens of thousands of citations using our equipment, which is pretty much a gold standard in research and in education.

56:41

And you can see here some of the various recording platforms that we have: the MP160, which was used here, a modular platform.

56:52

We can mix wireless and wired channels, 16 at a time, and you can combine many of these … for big setups.

57:01

The wireless units, the new generation of smart amplifiers, are extremely tiny but give you really good signals. And then we also have the four-channel system, the MP36R, which gives you universal amplifiers, so they can be configured for all sorts of signals.

57:18

And just overall, BIOPAC is involved heavily both in research and education; it's pretty much an even split. We provide educational products, and we also provide research products.

57:33

And looking at the research field,

57:36

we have all sorts of options for stimulating your participant in different sensory modalities and then recording the responses.

57:46

So, we can be talking about electrical stimulation, thermal stimulation, haptic stimulation, transcranial direct current stimulation, olfactory, etcetera, and then recording physiological data like ECG, EEG, functional near-infrared, skin conductance.

58:05

You name it, pretty much anything you can imagine: combining that with eye tracking data, video, virtual reality, performing experiments in MRI, where we've brought in practically all the sensors, and we're also able to do it in the field.

58:24

And some other recording platforms include the Smart Center, which is very small so you can take it in a bag. We have the B-Alert EEG systems. And of course our teaching systems, and we also now have a solution for distance learning, if any of you are concerned. And these are our functional near-infrared systems; functional near-infrared is becoming more popular in the research field and in education.

58:51

And that allows you to measure changes in oxygenated and deoxygenated hemoglobin in the brain; it can be wireless or wired, many channels or a few, and you can have an educational or a research version.

1:00:02

We're almost done getting all the answers, and it looks like most people have never acquired, analyzed, or worked with external data

1:00:11

when dealing with The Observer XT, and the rest of the categories are roughly equal.

1:00:28

OK, so, I'll just go ahead and take over the screen here, or, if needed, make you guys the presenter.

1:00:37

Let me know, OK, if there's something you want to show. So, back to the questions; I know some people have been waiting. Alex is not turning on his camera, he's having some bandwidth issues. But, Alex, could you address the importance of cleaning the data prior to importing it into The Observer?

1:00:57

Yeah. I mean, prior to any sort of data analysis that you’re going to perform, you want to inspect the data. And more often than not, there are corrections that have to be made.

1:01:11

The data that we have been looking at has been very clean, but, in addition to filtering, there may be artifacts that have to be identified and removed.

1:01:21

So, in most cases it is absolutely essential to do that, unless you're measuring something extremely trivial, where you can accept a certain percentage of error and still get reasonable results. But that's an exception to the rule.

1:01:40

So, it is very convenient to get the data directly into The Observer, but it's also, I think, very important that you perform the required steps for data cleanup, filtering, and analysis, and then bring it in again: either the processed data, or maybe just the results, like we demonstrated by pasting the results into the journal and importing them as markers.

1:02:12

Our various webinars for different physiological signals can, I think, be very helpful in that context, because they each cover identifying good and bad data, what steps to take to fix any issues, et cetera, OK?

1:02:29

What about this one: Mark asked, is there a limit on the number of channels that can be run at the same time?

1:02:37

Well, I guess it depends on which software we're talking about. So, on the AcqKnowledge side:

1:02:44

first of all, theoretically, you can have lots and lots of channels, right? There is a limit: the MP160 can do up to 16 analog channels.

1:02:55

And then you can have a lot of calculation channels and digital channels. And you can record from multiple MP160s on the same computer, but at some point bandwidth is an issue. So on a single machine, practically, three MP160s are probably a very realistic limit; that will get you about 48 channels or so. You can synchronize many, many MP160 units via hardware synchronization cables that we provide, and we have helped perform such large-scale setups. So then, essentially, the channel number is huge, and you can import all of them into AcqKnowledge together in the end, if you want to; you probably don't want to, because that's just a lot of data.

1:03:48

OK, thank you. Let's see here.

1:03:53

OK, so we've got one here; this is from Madeleine.

1:03:59

In terms of measuring synchronicity between the behavioral markers from The Observer and the AcqKnowledge data:

1:04:05

does this process work for multiple people simultaneously, i.e., having a nuclear family, each wearing an EDA bracelet, and having multiple cameras picking up the faces?

1:04:18

If they're all essentially connected to the same BIOPAC system, or even to multiple BIOPAC systems: first of all, we can trigger-start all the BIOPAC systems at exactly the same time,

1:04:32

by a single switch. So all of the BIOPAC channels can be sure to be synchronized within a fraction of a millisecond.

1:04:42

And then you simply need a single pathway to synchronize from The Observer to any one of those BIOPAC systems. So the short answer is yes, totally, you can do that.

1:04:54

Yeah, Alex, maybe you remember one of the universities in the Netherlands where they had a system to simultaneously measure EDA in 10 participants. Yup.

1:05:06

And that went, yeah. That was perfectly fine. So yes. The answer is yes.

1:05:14

Yeah?

1:05:16

OK, great. Thank you guys.

1:05:18

And then a bunch of people have asked about integrating eye tracking and other stimuli. Does that work as well?

1:05:27

Combined systems?

1:05:29

Yeah, this works in both, right? So I think we can answer yes here on both ends.

1:05:35

Yeah. Well, from The Observer side, we can integrate with Tobii software in a similar way as we can integrate with AcqKnowledge.

1:05:46

So, it works in a similar way.

1:05:51

And then BIOPAC integrates with eye trackers as well.

1:05:57

So, there are a lot of opportunities for integration there with the BIOPAC software.

1:06:02

Yes, as I mentioned, we also provide synchronization and importing of Tobii eye tracker data. So essentially, whichever route you pick, you have a way to go.

1:06:16

Mmm hmm.

1:06:17

Right, OK. So then, how can this be used with an EEG system?

1:06:25

Boom.

1:06:27

Yeah, maybe a little bit general of a question. But yes, we have various EEG platforms, and they're just channels in the software, right? Just like all the other channels.

1:06:42

So, it's no different than recording EDA or ECG. They just appear as channels, and they will be sent to The Observer the same way as anything else. So fundamentally, it will behave the same way.

1:06:57

OK, and can FaceReader also be integrated?

1:07:01

Yes. I think again, yes, in both platforms.

1:07:05

I'm just getting through the questions here.

1:07:06

I know, yeah. Sorry for taking the lead on a lot of the questions; I just seem to be jumping in. Well, yeah, no, it's fine. It's fine.

1:07:19

OK, so here's one; I think maybe it's for both of you. What are the best methods for measuring emotions

1:07:41

when you acquire physiological and behavioral data?

1:07:43

I think, well, emotion is a very complex concept, but in our experience, if you want to measure emotional responses of people, of course you can look at facial expressions, which kind of represent the emotional state of people.

1:08:04

But, I think it’s very risky to only rely on facial expressions.

1:08:08

So, you should also always take into account body language, and also physiological signals.

1:08:15

Because it could be that, depending on the circumstances and the context of the test, people are not always very expressive in showing their emotions in their facial expressions.

1:08:29

So, often, you have to rely on body language, in combination with physiological signals.

1:08:37

And based on multiple signals, you can try to look for patterns in the combination of different signals to infer the emotional state or response of the participants.

1:08:56

Um.

1:08:57

So I would say it's good to

1:08:59

use a combination of different signals.

1:09:03

Alex? Yeah, totally, of course. I think any time you want to map a physiological, sorry, a psychological state to a physiological state, and try to get some insight into what the participant is experiencing, the more data you are working with, the better your chances. But it also depends on what emotion is and how you define it; it's not an easy question.

1:09:28

If you're just interested in the positive versus negative axis of response, you can look at the facial EMG activation in the … corrugator muscles.

1:09:38

If you're looking at intensity, you can look at measures of sympathetic or parasympathetic arousal, like maybe skin conductance. So it really comes down to how you're defining emotion. And finally, facial EMG, I definitely think, has good use

1:09:58

if you're able to employ it in your protocol, because, as Patrick mentioned, it can detect very subtle activations in the muscles which do not actually result in overt movement, and you can pick up those changes in the potentials with electrodes. So, lots of options; it just depends what you're trying to quantify. Exactly.

1:10:23

Great, thanks. Thanks, everyone.

1:10:26

So, great job, Patrick and Alex; thank you so much for all of the work that you put into this presentation and showcasing the integration.

1:10:38

It really was an engaging webinar, and I know the attendees are appreciative.

1:10:45

I know that some people have some follow-up questions about eye tracking and SMI.

1:10:49

I think it's probably best for you guys to speak to our support departments; they can answer your questions more easily. So, I have some closing comments.

1:11:01

I just want to let everyone know that we've run through the time for the Q&A, and I'm really excited about this webinar.

1:11:10

In fact, Alex, I would love for you to pop your video on, just for a second.

1:11:16

Yeah. Let me try to go to the secondary camera; I think the first one is using a bit too much bandwidth. Let me just try it here, give me a second.

1:11:27

OK, so switching over to the second one, and it'll be up in a second.

1:11:35

Yeah, I see. Oh yeah, I wanted to do something there.

1:11:43

That's maybe not going to work for me. OK, well, let me get your video separately.

1:11:48

Hang on one sec, OK.

1:11:49

So, one more slide, and while everyone is just hanging on there, I will let you know that there's a lot of information on the Noldus and BIOPAC websites.

 

 

1:12:35

So, definitely copy these links down, and go back and look at the list of webinars that are available. Today's webinar was recorded, and we're going to e-mail you a link to the recording, the slides, and the Q&A document over the next few weeks.

 

1:13:04

Feel free to visit Noldus.com and …

1:13:07

BIOPAC.com for additional references and information about the products showcased here, training documents, videos, and future events.

1:13:17

Speakers, anything else before we conclude?

1:13:22

Well, I’d like to thank everyone for joining and it was a real pleasure presenting this webinar with BIOPAC.

1:13:32

Likewise, and thanks. Thanks, Patrick, and especially thanks to those of you who are waking up early in the morning to join us or are at the end of their day. We really appreciate it.

 

 

1:14:29

Thanks, audience, for your questions and for your engagement. We really appreciate your time. And that concludes today’s webinar. Thank you again, everyone. Stay safe.
