c# - Kinect sideways skeleton tracking


I'm currently using the Microsoft Kinect to measure angles between joints. Most measurements work correctly, but whenever a person turns sideways (sitting on a chair), the Kinect no longer tracks the skeleton accurately. To illustrate my problem, I've added 3 pictures of the Kinect depth view.

Seated sideways measurement with skeleton tracking

Seated sideways measurement without skeleton tracking

Sideways measurement with skeleton tracking

As you can see, 2 out of the 3 measurements are "correct". Whenever I lift my leg, though, the Kinect stops tracking the skeleton correctly. Does anyone have a solution for this, or is it simply a limitation of the Kinect?

Thanks in advance.

Update 1: The JointTrackingState enumeration on the tracked joints shown in screenshot 2 is marked as Inferred, yet the depth view is tracking my whole body.

Update 2: In screenshot 2 I'm trying to track my front leg, highlighted in green. I know the other leg is not being tracked, but I assume that doesn't matter.

Update 3: The following code selects the skeleton:

private Skeleton StickySkeleton(Skeleton[] skeletons)
{
    if (skeletons.Count<Skeleton>(skeleton => skeleton.TrackingId == _trackedSkeletonId) <= 0)
    {
        _trackedSkeletonId = -1;
        _skeleton = null;
    }

    if (_trackedSkeletonId == -1)
    {
        Skeleton foundSkeleton = skeletons.FirstOrDefault<Skeleton>(skeleton => skeleton.TrackingState == SkeletonTrackingState.Tracked);

        if (foundSkeleton != null)
        {
            _trackedSkeletonId = foundSkeleton.TrackingId;
            return foundSkeleton;
        }
    }

    return _skeleton;
}

Whenever a skeleton is tracked, its data is used to draw the joint points and to calculate the angles between joints.
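For reference, here is a minimal sketch of how such a joint angle could be computed from three joints (for example hip, knee and ankle) via the vectors between their SkeletonPoint positions. The JointAngle helper and the choice of joints are illustrative and not part of the original question's code:

private static double JointAngle(Joint vertex, Joint a, Joint b)
{
    // Vectors from the vertex joint (e.g. the knee) to the two neighbouring joints
    SkeletonPoint v = vertex.Position, pa = a.Position, pb = b.Position;
    double ax = pa.X - v.X, ay = pa.Y - v.Y, az = pa.Z - v.Z;
    double bx = pb.X - v.X, by = pb.Y - v.Y, bz = pb.Z - v.Z;

    double dot = ax * bx + ay * by + az * bz;
    double lenA = Math.Sqrt(ax * ax + ay * ay + az * az);
    double lenB = Math.Sqrt(bx * bx + by * by + bz * bz);
    if (lenA == 0 || lenB == 0)
    {
        return 0;
    }

    // Clamp to avoid NaN from rounding, then convert radians to degrees
    double cos = Math.Max(-1.0, Math.Min(1.0, dot / (lenA * lenB)));
    return Math.Acos(cos) * 180.0 / Math.PI;
}

// For example, the angle at the right knee:
// double kneeAngle = JointAngle(skeleton.Joints[JointType.KneeRight],
//                               skeleton.Joints[JointType.HipRight],
//                               skeleton.Joints[JointType.AnkleRight]);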

Update 4: I've tested while sitting on a "block", which is much simpler than a chair. Unfortunately, the Kinect still behaves the same way.

Here are the 2 screenshots:

Sitting on a block 1

Sitting on a block 2

Best Answer

As Renaud Dumont said, I would do something with JointTrackingState. Since you're using the knees, I did it with the variables leftknee and rightknee, which are Joints. Here's the code; you could add JointType.FootRight and JointType.FootLeft as well as the Hip types, but I'll leave that to you.

static Skeleton first = new Skeleton();

Joint leftknee = first.Joints[JointType.KneeLeft];
Joint rightknee = first.Joints[JointType.KneeRight];

if ((leftknee.TrackingState == JointTrackingState.Inferred ||
     leftknee.TrackingState == JointTrackingState.Tracked) &&
    (rightknee.TrackingState == JointTrackingState.Tracked ||
     rightknee.TrackingState == JointTrackingState.Inferred))
{

}

Alternatively, if you want to handle tracking just one knee at a time, or both knees at once, you could do this:

if ((leftknee.TrackingState == JointTrackingState.Inferred ||
     leftknee.TrackingState == JointTrackingState.Tracked) &&
    (rightknee.TrackingState == JointTrackingState.Tracked ||
     rightknee.TrackingState == JointTrackingState.Inferred))
{

}

else if (leftknee.TrackingState == JointTrackingState.Inferred ||
         leftknee.TrackingState == JointTrackingState.Tracked)
{

}

else if (rightknee.TrackingState == JointTrackingState.Inferred ||
         rightknee.TrackingState == JointTrackingState.Tracked)
{

}
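In each of those branches you would then only update or draw the knee that is at least Inferred, which is what the full example further down does with CameraPosition for leftknee and rightknee.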

FYI, the reason Skeleton first is static is so that it can be used to create the Joints:

 static Skeleton first;

as opposed to

 Skeleton first;

Edit 1


I've come to the conclusion that this is extremely hard to do. I think the method above will work, but I just wanted to include what I was working on, in case you can find some way to make it work. Anyway, here's the code I was working on: it's another class that was simply meant to be another SkeletalTrackingState, in which I tried to add an Inferred value to the enum. Unfortunately, enums are impossible to inherit from. If you can find a way to make this work, you'll have my eternal respect for being a better programmer than me ;). Without further ado, the .dll I was trying to make:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.Kinect;

namespace IsInferred
{
    public abstract class SkeletonInferred : Skeleton
    {
        public bool inferred;
        static Skeleton first1 = new Skeleton();
        Joint handright;
        Joint handleft;
        Joint footright;
        Joint footleft;
        Joint ankleleft;
        Joint ankleright;
        Joint elbowleft;
        Joint elbowright;
        Joint head;
        Joint hipcenter;
        Joint hipleft;
        Joint hipright;
        Joint shouldercenter;
        Joint shoulderleft;
        Joint shoulderright;
        Joint kneeleft;
        Joint kneeright;
        Joint spine;
        Joint wristleft;
        Joint wristright;

        public SkeletonInferred(bool inferred)
        {

        }

        public enum Inferred
        {
            NotTracked = SkeletonTrackingState.NotTracked,
            PositionOnly = SkeletonTrackingState.PositionOnly,
            Tracked = SkeletonTrackingState.Tracked,
            Inferred = 3,
        }

        private void IsInferred(object sender, AllFramesReadyEventArgs e)
        {
            handright = first1.Joints[JointType.HandRight];
            handleft = first1.Joints[JointType.HandLeft];
            footright = first1.Joints[JointType.FootRight];
            footleft = first1.Joints[JointType.FootLeft];
            ankleleft = first1.Joints[JointType.AnkleLeft];
            ankleright = first1.Joints[JointType.AnkleRight];
            elbowleft = first1.Joints[JointType.ElbowLeft];
            elbowright = first1.Joints[JointType.ElbowRight];
            head = first1.Joints[JointType.Head];
            hipcenter = first1.Joints[JointType.HipCenter];
            hipleft = first1.Joints[JointType.HipLeft];
            hipright = first1.Joints[JointType.HipRight];
            shouldercenter = first1.Joints[JointType.ShoulderCenter];
            shoulderleft = first1.Joints[JointType.ShoulderLeft];
            shoulderright = first1.Joints[JointType.ShoulderRight];
            kneeleft = first1.Joints[JointType.KneeLeft];
            kneeright = first1.Joints[JointType.KneeRight];
            spine = first1.Joints[JointType.Spine];
            wristleft = first1.Joints[JointType.WristLeft];
            wristright = first1.Joints[JointType.WristRight];

            if (handleft.TrackingState == JointTrackingState.Inferred &
                handright.TrackingState == JointTrackingState.Inferred &
                head.TrackingState == JointTrackingState.Inferred &
                footleft.TrackingState == JointTrackingState.Inferred &
                footright.TrackingState == JointTrackingState.Inferred &
                ankleleft.TrackingState == JointTrackingState.Inferred &
                ankleright.TrackingState == JointTrackingState.Inferred &
                elbowleft.TrackingState == JointTrackingState.Inferred &
                elbowright.TrackingState == JointTrackingState.Inferred &
                hipcenter.TrackingState == JointTrackingState.Inferred &
                hipleft.TrackingState == JointTrackingState.Inferred &
                hipright.TrackingState == JointTrackingState.Inferred &
                shouldercenter.TrackingState == JointTrackingState.Inferred &
                shoulderleft.TrackingState == JointTrackingState.Inferred &
                shoulderright.TrackingState == JointTrackingState.Inferred &
                kneeleft.TrackingState == JointTrackingState.Inferred &
                kneeright.TrackingState == JointTrackingState.Inferred &
                spine.TrackingState == JointTrackingState.Inferred &
                wristleft.TrackingState == JointTrackingState.Inferred &
                wristright.TrackingState == JointTrackingState.Inferred)
            {
                inferred = true;
            }
        }
    }
}

The code in your project (compile errors):

    using IsInferred;

static bool Inferred = false;
SkeletonInferred inferred = new SkeletonInferred(Inferred);
static Skeleton first1 = new Skeleton();

Skeleton foundSkeleton = skeletons.FirstOrDefault<Skeleton>(skeleton => skeleton.TrackingState == SkeletonTrackingState.Inferred);
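Since the class above doesn't compile (enums can't be inherited, and SkeletonTrackingState has no Inferred value), a rough alternative is a standalone helper that performs the same per-joint check without deriving from Skeleton. This is only a sketch; the AllJointsInferred name is made up here:

using Microsoft.Kinect;

public static class SkeletonInferredHelper
{
    // Hypothetical stand-in for the "Inferred" tracking state the class above
    // tried to add: true when every joint of the skeleton is merely Inferred.
    public static bool AllJointsInferred(Skeleton skeleton)
    {
        if (skeleton == null)
        {
            return false;
        }

        foreach (Joint joint in skeleton.Joints)
        {
            if (joint.TrackingState != JointTrackingState.Inferred)
            {
                return false;
            }
        }

        return true;
    }
}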

Good luck, and I hope this points you in the right direction or otherwise helps!

My Code


Here is my code that you asked for. Yes, it's from the Skeletal Tracking Fundamentals, but the code was right there and I didn't want to start a new project with mostly the same stuff. Enjoy!

Code

     // (c) Copyright Microsoft Corporation.
// This source is subject to the Microsoft Public License (Ms-PL).
// Please see http://go.microsoft.com/fwlink/?LinkID=131993 for details.
// All other rights reserved.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;
using Microsoft.Kinect;
using Coding4Fun.Kinect.Wpf;

namespace SkeletalTracking
{
/// <summary>
/// Interaction logic for MainWindow.xaml
/// </summary>
public partial class MainWindow : Window
{
public MainWindow()
{
InitializeComponent();
}

bool closing = false;
const int skeletonCount = 6;
Skeleton[] allSkeletons = new Skeleton[skeletonCount];

private void Window_Loaded(object sender, RoutedEventArgs e)
{
kinectSensorChooser1.KinectSensorChanged += new DependencyPropertyChangedEventHandler(kinectSensorChooser1_KinectSensorChanged);

}

void kinectSensorChooser1_KinectSensorChanged(object sender, DependencyPropertyChangedEventArgs e)
{
KinectSensor old = (KinectSensor)e.OldValue;

StopKinect(old);

KinectSensor sensor = (KinectSensor)e.NewValue;

if (sensor == null)
{
return;
}




var parameters = new TransformSmoothParameters
{
Smoothing = 0.3f,
Correction = 0.0f,
Prediction = 0.0f,
JitterRadius = 1.0f,
MaxDeviationRadius = 0.5f
};
sensor.SkeletonStream.Enable(parameters);

//sensor.SkeletonStream.Enable();

sensor.AllFramesReady += new EventHandler<AllFramesReadyEventArgs>(sensor_AllFramesReady);
sensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
sensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30);

try
{
sensor.Start();
}
catch (System.IO.IOException)
{
kinectSensorChooser1.AppConflictOccurred();
}
}

void sensor_AllFramesReady(object sender, AllFramesReadyEventArgs e)
{
if (closing)
{
return;
}

//Get a skeleton
Skeleton first = GetFirstSkeleton(e);

if (first == null)
{
return;
}



//set scaled position
//ScalePosition(headImage, first.Joints[JointType.Head]);
ScalePosition(leftEllipse, first.Joints[JointType.HandLeft]);
ScalePosition(rightEllipse, first.Joints[JointType.HandRight]);
ScalePosition(leftknee, first.Joints[JointType.KneeLeft]);
ScalePosition(rightknee, first.Joints[JointType.KneeRight]);

GetCameraPoint(first, e);

}

void GetCameraPoint(Skeleton first, AllFramesReadyEventArgs e)
{

using (DepthImageFrame depth = e.OpenDepthImageFrame())
{
if (depth == null ||
kinectSensorChooser1.Kinect == null)
{
return;
}


//Map a joint location to a point on the depth map
//head
DepthImagePoint headDepthPoint =
depth.MapFromSkeletonPoint(first.Joints[JointType.Head].Position);
//left hand
DepthImagePoint leftDepthPoint =
depth.MapFromSkeletonPoint(first.Joints[JointType.HandLeft].Position);
//right hand
DepthImagePoint rightDepthPoint =
depth.MapFromSkeletonPoint(first.Joints[JointType.HandRight].Position);

DepthImagePoint rightKnee =
depth.MapFromSkeletonPoint(first.Joints[JointType.KneeRight].Position);

DepthImagePoint leftKnee =
depth.MapFromSkeletonPoint(first.Joints[JointType.KneeLeft].Position);


//Map a depth point to a point on the color image
//head
ColorImagePoint headColorPoint =
depth.MapToColorImagePoint(headDepthPoint.X, headDepthPoint.Y,
ColorImageFormat.RgbResolution640x480Fps30);
//left hand
ColorImagePoint leftColorPoint =
depth.MapToColorImagePoint(leftDepthPoint.X, leftDepthPoint.Y,
ColorImageFormat.RgbResolution640x480Fps30);
//right hand
ColorImagePoint rightColorPoint =
depth.MapToColorImagePoint(rightDepthPoint.X, rightDepthPoint.Y,
ColorImageFormat.RgbResolution640x480Fps30);

ColorImagePoint leftKneeColorPoint =
depth.MapToColorImagePoint(leftKnee.X, leftKnee.Y,
ColorImageFormat.RgbResolution640x480Fps30);

ColorImagePoint rightKneeColorPoint =
depth.MapToColorImagePoint(rightKnee.X, rightKnee.Y,
ColorImageFormat.RgbResolution640x480Fps30);



//Set location
CameraPosition(headImage, headColorPoint);
CameraPosition(leftEllipse, leftColorPoint);
CameraPosition(rightEllipse, rightColorPoint);


Joint LEFTKNEE = first.Joints[JointType.KneeLeft];
Joint RIGHTKNEE = first.Joints[JointType.KneeRight];

if ((LEFTKNEE.TrackingState == JointTrackingState.Inferred ||
LEFTKNEE.TrackingState == JointTrackingState.Tracked) &&
(RIGHTKNEE.TrackingState == JointTrackingState.Tracked ||
RIGHTKNEE.TrackingState == JointTrackingState.Inferred))
{
CameraPosition(rightknee, rightKneeColorPoint);
CameraPosition(leftknee, leftKneeColorPoint);
}

else if (LEFTKNEE.TrackingState == JointTrackingState.Inferred ||
LEFTKNEE.TrackingState == JointTrackingState.Tracked)
{
CameraPosition(leftknee, leftKneeColorPoint);
}

else if (RIGHTKNEE.TrackingState == JointTrackingState.Inferred ||
RIGHTKNEE.TrackingState == JointTrackingState.Tracked)
{
CameraPosition(rightknee, rightKneeColorPoint);
}
}
}


Skeleton GetFirstSkeleton(AllFramesReadyEventArgs e)
{
using (SkeletonFrame skeletonFrameData = e.OpenSkeletonFrame())
{
if (skeletonFrameData == null)
{
return null;
}


skeletonFrameData.CopySkeletonDataTo(allSkeletons);

//get the first tracked skeleton
Skeleton first = (from s in allSkeletons
where s.TrackingState == SkeletonTrackingState.Tracked
select s).FirstOrDefault();

return first;

}
}

private void StopKinect(KinectSensor sensor)
{
if (sensor != null)
{
if (sensor.IsRunning)
{
//stop sensor
sensor.Stop();

//stop audio if not null
if (sensor.AudioSource != null)
{
sensor.AudioSource.Stop();
}


}
}
}

private void CameraPosition(FrameworkElement element, ColorImagePoint point)
{
//Divide by 2 for width and height so point is right in the middle
// instead of in top/left corner
Canvas.SetLeft(element, point.X - element.Width / 2);
Canvas.SetTop(element, point.Y - element.Height / 2);

}

private void ScalePosition(FrameworkElement element, Joint joint)
{
//convert the value to X/Y
//Joint scaledJoint = joint.ScaleTo(1280, 720);

//convert & scale (.3 = means 1/3 of joint distance)
Joint scaledJoint = joint.ScaleTo(1280, 720, .3f, .3f);

Canvas.SetLeft(element, scaledJoint.Position.X);
Canvas.SetTop(element, scaledJoint.Position.Y);

}


private void Window_Closing(object sender, System.ComponentModel.CancelEventArgs e)
{
closing = true;
StopKinect(kinectSensorChooser1.Kinect);
}

private void kinectDepthViewer1_Loaded(object sender, RoutedEventArgs e)
{

}

}
}

XAML

<Window x:Class="SkeletalTracking.MainWindow"
xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
Title="MainWindow" Height="600" Width="800" Loaded="Window_Loaded"
xmlns:my="clr-namespace:Microsoft.Samples.Kinect.WpfViewers;assembly=Microsoft.Samples.Kinect.WpfViewers"
Closing="Window_Closing" WindowState="Maximized">
<Canvas Name="MainCanvas">
<my:KinectColorViewer Canvas.Left="0" Canvas.Top="0" Width="640" Height="480" Name="kinectColorViewer1"
Kinect="{Binding ElementName=kinectSensorChooser1, Path=Kinect}" />
<Ellipse Canvas.Left="0" Canvas.Top="0" Height="50" Name="leftEllipse" Width="50" Fill="#FF4D298D" Opacity="1" Stroke="White" />
<Ellipse Canvas.Left="100" Canvas.Top="0" Fill="#FF2CACE3" Height="50" Name="rightEllipse" Width="50" Opacity="1" Stroke="White" />
<my:KinectSensorChooser Canvas.Left="250" Canvas.Top="380" Name="kinectSensorChooser1" Width="328" />
<Image Canvas.Left="66" Canvas.Top="90" Height="87" Name="headImage" Stretch="Fill" Width="84" Source="/SkeletalTracking;component/c4f-color.png" />
<Ellipse Canvas.Left="283" Canvas.Top="233" Height="23" Name="leftknee" Stroke="Black" Width="29" />
<Ellipse Canvas.Left="232" Canvas.Top="233" Height="23" Name="rightknee" Stroke="Black" Width="30" />
</Canvas>
</Window>

Here's a picture just to show how far off the Kinect can sometimes be:

Hint: Notice how only my arm and part of the background are detected.

Regarding "c# - Kinect sideways skeleton tracking", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/10192476/
