Qatar Foundation Annual Research Forum Volume 2011 Issue 1
- Conference date: 20-22 Nov 2011
- Location: Qatar National Convention Center (QNCC), Doha, Qatar
- Volume number: 2011
- Published: 20 November 2011
-
-
Design of Validation Study of the Lower Extremity Functional Status Scale for Anterior Cruciate Ligament Reconstruction in Arabic Speaking Athletes
Authors: Amy Leona Sandridge and Michael Sarestsky
Abstract
Background: The Lower Extremity Functional Scale (LEFS) has been found to be reliable and valid in several populations and languages. This 20-question self-report measure scores the functional ability of persons with any lower-extremity musculoskeletal condition from 0 to 80. The objective measures used for comparison will be the 40-meter fast self-paced walk [SW], timed up-and-go [TUG] and 10-step stair test [ST].
Objectives: To validate the Arabic LEFS in an Arabic-speaking, male athletic population recovering from anterior cruciate ligament (ACL) reconstruction, and to compare the LEFS with the objective measures (SW, TUG and ST).
Methods: 100 male, Arabic-speaking athletes will be followed for one year. Athletes will complete the LEFS prior to surgery, one week after surgery, and every week thereafter until 12 weeks. They will then complete the LEFS at monthly intervals from six months to one year. Alongside the LEFS they will also complete the SW, TUG and ST, which will be reported by time, pain and exertion.
Results: Based on pilot data collected from 92 athletes, the following results have been obtained. The average age of the athletes was 25 years, and they had played their chosen sport for an average of three years prior to injury. All athletes were injured in the pursuit of sport, although not always their own competitive sport. With respect to the LEFS, 14 patients had baseline visits. The baseline LEFS score ranged from 22 to 64 with a mean of 51. One week following surgery the range was greater: 11 to 76, with a mean of 44. By 12 weeks, loss to follow-up had become an issue, with only 7 of the 92 patients returning. Final assessments were made after week 30 and showed a range of 67 to 78 with a mean of 74.
Conclusions: (1) The LEFS appears valid in this population; however, without adherence to the study proposal, no statistical tests of significance can be performed. (2) A research assistant will be required to maintain the study proposal requirements, specifically regular follow-up of the athletes.
-
-
-
Temperature Circadian Variations in Workers in a Hot Environment in Qatar
Abstract
Background: Body core temperature fluctuates during the day following a sinusoidal variation, with its maximum (acrophase) in the late afternoon. This circadian rhythm is mainly endogenous, but it can be influenced by environmental factors such as work and social and physical activities.
Objectives: 1) To verify whether aluminum shift-workers present different core temperatures at different times of the day (i.e. diurnal variations); 2) To characterize these diurnal variations and their consequences.
Methods: Twenty-nine employees from the aluminum industry participated in this preliminary study. They worked indoors, where the temperature was typically in excess of 40°C, and each worker wore protective clothing consisting of a suit, gloves and a mask. Core temperature (ingestible pill) data covering a 24-hour circadian cycle were obtained in 10 workers during morning, afternoon and night shifts. Circadian variation in temperature was characterized using a cosine function (cosinor model), and the mesor (average) and acrophase of the function were calculated for each participant.
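For reference, a minimal sketch of a cosinor fit is given below; it assumes a 24-hour period and hypothetical temperature samples, and is not the authors' exact analysis pipeline.

```python
# Minimal cosinor-fit sketch (not the authors' exact pipeline): estimate the
# mesor, amplitude and acrophase of a 24-hour core-temperature rhythm by
# linear least squares on cosine/sine regressors. Input data are hypothetical.
import numpy as np

def fit_cosinor(hours, temps, period=24.0):
    """Fit temps ~ mesor + A*cos(2*pi*(hours - acrophase)/period)."""
    w = 2 * np.pi * hours / period
    X = np.column_stack([np.ones_like(hours), np.cos(w), np.sin(w)])
    mesor, beta, gamma = np.linalg.lstsq(X, temps, rcond=None)[0]
    amplitude = np.hypot(beta, gamma)
    acrophase_h = (np.arctan2(gamma, beta) * period / (2 * np.pi)) % period
    return mesor, amplitude, acrophase_h

# Hypothetical 24-hour recording sampled every 2 hours
t = np.arange(0, 24, 2.0)
temp = 37.45 + 0.3 * np.cos(2 * np.pi * (t - 17) / 24) + 0.05 * np.random.randn(t.size)
print(fit_cosinor(t, temp))  # roughly (37.45, 0.3, 17 h)
```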
Results: Core temperatures recorded on the work site were significantly higher in the afternoon or early evening (from 12:00h to 20:00h) than at night or during the early morning (from 21:00h to 08:00h). These differences were not triggered by work duration but by the time of day. There were large differences between the individual acrophase times, probably due to different working activities as well as the influence of synchronization/shift from the previous days. However, core temperature was consistently higher in the afternoon than during the morning or night. The circadian variation in body core temperature showed a mesor of 37.45°C. This represents an average core temperature half a degree higher than generally observed in the general population at rest.
Conclusions: The current data showed that workers from the aluminum industry present a relatively elevated average core temperature, with the highest values being reached during the afternoon shift. This suggests that special attention should be given to the afternoon shift and that break/cooling procedures should be implemented if necessary. These preliminary observations need to be complemented by clinical and behavioral observations.
-
-
-
Prevalence of Smoking and Exposure to Secondhand Smoke among Qatari School Children: Results from the Pilot Phase of the National Epidemiological Study of Lung Health among Qatari National School Children
Authors: Amy Leona Sandridge, Hana Said, Amjad Tuffaha and William Greer
Abstract
Secondhand exposure to tobacco smoke (SHS) has been proposed to potentially increase the risk of acute respiratory infections, middle ear disease, exacerbated asthma and decreased lung function in children.
The objectives of this study of Qatari schoolchildren were six-fold: to assess the feasibility of a national study on athletic participation, healthy living and lung function; to provide estimates of height and weight; to estimate the prevalence of exposure to SHS; to assess potential informant bias; to estimate the prevalence of smoking; and to compare reported exposure to SHS and reported smoking with levels of saliva cotinine (SC).
This pilot phase of the National Epidemiological Study of Lung Health among Qatari Schoolchildren collected data from 321 boys and 413 girls enrolled in government schools in grades 7 to 12 using questionnaires administered by trained native Arabic research staff from October 2008 to April 2009. SC samples, height, weight and spirometry data were collected.
Mean Body Mass Index percentile ranged from the 42nd percentile among 19-year-old boys to the 76th among 17-year-olds. Among girls the range was narrower: from the 61st percentile in 17-year-olds to the 86th in 11-year-olds. For male schoolchildren, mothers answered 38% of the questionnaires while fathers answered 62%; for daughters, mothers responded for 58% and fathers for 42%. We found that mothers were more likely to report higher amounts of exposure to SHS than fathers, especially for daughters, whereas fathers reported little exposure to SHS. There were 106 children who showed exposure to nicotine by SC level; of these, 14 (13%) reported that they were smokers.
Seventy-two percent of children were reported to have been exposed to SHS. This varied by sex of child and reporting parent. The finding on potential reporting bias between mothers and fathers has implications for the future national study. The reported prevalence of smoking among this population was 3%.
Conclusions: (1) Qatari schoolchildren are exposed to SHS; (2) The national study must be designed to control for respondent bias; (3) The national study is feasible.
-
-
-
Negative Influence of Intermittent Ramadan Fasting and Unhealthy Lifestyle on Body Composition, Sleep, Physical Fitness and Iron Indices in School Boys
Abstract
Background: Schoolchildren must practice a healthy diet as well as an active lifestyle to support their physical growth and development. Previous studies have shown that intermittent fasting can affect dietary intake, sleep duration and circadian patterns in adults, but there is a lack of related literature in schoolchildren.
Purpose: The aim of this study was to objectively assess the effect of Ramadan fasting on physiological parameters in young children.
Methods: Eighteen boys aged 12.6±1.5 years were assessed at baseline (BR) and followed up twice during Ramadan (1st week [R1], 4th week [R4]) and once two weeks after the end of Ramadan (AR). Body composition was assessed using anthropometry and a DXA scan. Blood investigations included a complete blood count, lipid profile analysis and iron indices. Patterns of daily activity and core body temperature were recorded using a triaxial accelerometer and an ingestible thermistor pill, respectively. Dietary intake was assessed by an experienced nutritionist based on digital images of food and drinks consumed by each participant during a 24-hour period. A repeated sprint test (RSA) of 6 × 15 m sprints interspersed with 15 s of rest was performed to evaluate fatigue resistance.
Results: There was a shift in daily peak activity from daytime (5:30 PM) to late night (12:00 AM), resulting in a loss of 1.8±0.6 hours of total sleep time during R4 (P<0.01). After 30 days of fasting there was no important change in lipids, but there was a significant drop in serum iron from 17.7±1.6 μmol/L at BR to 13.1±1.4 μmol/L (P<0.01), suggesting a potential nutritional deficiency. Moreover, the reduction in serum iron was associated with younger age (r=0.47, P=0.05) and lighter body weight (r=0.37, P=0.13). Dietary analysis showed that subjects consumed a high-calorie diet deficient in fruits and vegetables during Ramadan, which explains the weight gain (+1.0±0.2 kg, P=0.001) and, consequently, the longer sprint times on the RSA test (+0.4±0.1 s, P=0.04) at R4 compared to baseline.
Conclusion: This study concludes that intermittent Ramadan fasting may have an undesirable impact on body composition, sleep patterns and nutritional habits in young schoolchildren. These results could be used to develop educational strategies to promote a healthy lifestyle in schoolchildren during Ramadan.
-
-
-
Combined Temperature and Altitude Challenges do not Exacerbate the Degree of Muscle Fatigue Despite Shorter Cycling Time to Exhaustion
Authors: Olivier Girard and Sébastien Racinais
Abstract
This study investigated the combined effect of environmental temperature [neutral (22°C/30%rH) vs. warm (35°C/40%rH)] and altitude challenge [sea level (FIO2 0.21) vs. reduced O2 content (FIO2 0.15)] on locomotor performance and the degree of end-exercise neuromuscular fatigue. Eleven physically active subjects cycled to exhaustion at constant workload (66% of their VO2max) in four different environmental conditions [Neutral/Sea level (Control), Warm/Sea level (Hot), Neutral/Reduced O2 content (Hypoxia) and Warm/Reduced O2 content (Hot+Hypoxia)]. The torque and EMG responses following electrical stimulation of the tibial nerve (plantar-flexion; soleus) were recorded before and 5 min after exercise. Time to exhaustion was reduced (P<0.05) in Hot (−35%) or Hypoxia (−36%) compared to Control, while Hot+Hypoxia (−51%) further decreased performance. There was no main effect of temperature or altitude on end-exercise core temperature (P=0.089 and P=0.070, respectively) and rating of perceived exertion (both P>0.05), nor any significant interaction. Reductions in maximal voluntary contraction (MVC) torque (−9%; P<0.001), voluntary activation (−4%; P<0.05) and peak twitch torque (−6%; P<0.05) from pre- to post-exercise were similar between all trials, independently of the environmental temperature or altitude. M-wave amplitude (at rest and during brief MVC) and RMS activity were reduced (P<0.05) in warm compared to neutral conditions, while altitude had no main effect on any measured parameters. Combining environmental temperature and altitude challenges further reduces cycling time to exhaustion but does not exacerbate the degree of end-exercise neuromuscular fatigue.
-
-
-
Neuromuscular Function Following Exercise-Heat Stress: Influence of Exercise Modality
Abstract
Background: Exercise-induced hyperthermia is associated with a decrease in force production capacity during brief (<5 seconds) and sustained (>10 seconds) maximal voluntary isometric contractions (MVIC). A reduction in central nervous system drive to the exercising muscles is suggested to mediate this decrement in order to prevent thermal injury. Until recently, the influence of exercise modality on neuromuscular function in the heat remained unclear. Two studies have since elucidated the role of constant-load and self-paced exercise on force production capacity and voluntary activation during hyperthermia.
Methods: Study one evaluated neuromuscular function after a 40 km cycling time trial in hot (35°C) and cool conditions (20°C). In study two, muscle function was evaluated after passive heating via water immersion to a core temperature of 39.5°C and following constant-load exercise to exhaustion at 60% of maximal oxygen uptake (38°C conditions). Prior to (control) and following each intervention, a sustained MVIC (20 s and 45 s for studies 1 and 2, respectively) was performed to measure force production. Voluntary activation of the knee extensors was measured via percutaneous electrical stimulation at three intervals during each MVIC.
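Voluntary activation assessed with superimposed electrical stimulation is conventionally quantified by twitch interpolation; the abstract does not state the exact formula used, so the expression below is an assumption rather than the studies' reported method (SIT is the superimposed twitch evoked during the MVIC, PT the resting potentiated twitch).

```latex
% Conventional twitch-interpolation estimate of voluntary activation (assumed,
% not quoted from the studies).
VA\,(\%) \;=\; \left(1 - \frac{SIT}{PT}\right) \times 100
```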
Results: Following self-paced exercise, mean force production decreased similarly in hot (15%) and cool (14%) conditions compared with control, despite a difference in core temperature of 0.8°C (P<0.001). A reduction in mean voluntary activation (P<0.05) accounted for ∼20% of the decrement in force. Interestingly, the extent of decline in voluntary activation was sustained for the duration of MVIC and did not progressively decrease. In the second study, mean force production was reduced following both interventions, but the magnitude of decline was more pronounced after exercise (P<0.05). As with study one, the decline in voluntary activation was similarly maintained (∼93%) following both interventions, with central fatigue accounting for <45% of the loss in force.
Conclusion: The loss of force production following exercise-induced hyperthermia appears to stem from both central and peripheral fatigue factors. Modality does not appear to influence neuromuscular function when exercise duration is similar and the final core temperature is within ∼1°C. The combination of exercise and heat stress exacerbates the loss of force due to prior contractile activity.
-
-
-
A New Innovative Therapy for Sports Related Soft Tissue Injuries: Platelet-Rich Plasma (PRP)
Authors: Hans Tol, Bruce Hamilton and Hakim Chalabi
Abstract
Introduction: Platelet-rich plasma (PRP) is the cellular component of plasma that settles after centrifugation and contains numerous growth factors. There is increasing interest in sports medicine in providing endogenous growth factors directly to the injury site, using autologous blood products such as PRP, to potentially facilitate healing and an earlier return to sport. Despite this interest, and apparent widespread use, there is a lack of high-level evidence trials assessing the efficacy of PRP.
Systematic review: We performed a systematic review of the literature and included clinical studies on PRP injections for ligament, muscle or tendon injuries. A few randomized controlled clinical trials have assessed the efficacy of PRP injections and none have demonstrated scientific evidence of efficacy. Scientific studies should be performed to assess clinical indications, efficacy, and safety of PRP, and this will require appropriately powered randomized controlled trials with adequate and validated clinical and functional outcome measures and sound statistical analysis.
Original research: Our group recently studied the effects of a platelet-rich plasma injection in patients with chronic midportion Achilles tendinopathy at 1-year follow-up in a randomized controlled trial. Fifty-four patients, aged 18 to 70 years, with chronic tendinopathy were randomized to receive either a blinded injection containing platelet-rich plasma or saline (placebo group) in addition to an eccentric training program. The validated mean Victorian Institute of Sports Assessment-Achilles score improved in both the platelet-rich plasma group and the placebo group after 1 year. There was no significant difference in improvement between the two groups (adjusted between-group difference, 5.5; 95% confidence interval, −4.9 to 15.8; P = .292). This randomized controlled trial showed no clinical or ultrasonographic superiority of platelet-rich plasma injection over a placebo injection.
Overall conclusion: Based on the systematic review and original research, the potential risks involved with PRP are fortunately very low. PRP has the potential to facilitate healing and an earlier return to sport after musculoskeletal injury, but its benefits remain unproven to date and there is a need for high-quality studies with a randomized design. Aspetar is currently performing a level I evidence research project into the benefits of PRP in muscle injuries.
-
-
-
Automated Marking of Sleep Spindles using Wavelet Packet Decomposition and Peak Tracking
Authors: Abdul Jaleel Palliyali, Reza Tafreshi, Beena Ahmed, Zurwa Khan and Hassan Al-Hail
Abstract
Sleep spindles, along with K-complexes, are hallmarks of stage 2 non-rapid eye movement (NREM) sleep EEG. Sleep spindles are of significant interest because they are associated with phenomena such as the 'stability' of sleep, the updating of knowledge with new memories, and the processing of sensorimotor and mnemonic information. Therefore, accurately marking their presence in sleep recordings is essential.
Accurate identification of spindles in EEG recordings has proved to be a time-consuming task, even for experts. Further, manual detection by different experts introduces disparity and biases due to inter-rater differences. Hence there is a crucial need for an automated detection algorithm.
The objective of this paper was to develop a robust algorithm for real-time automated spindle detection based on wavelet packet decomposition. The developed algorithm replicates the marking methodology used by sleep specialists to identify spindles. Spindles are transient 11–16 Hz oscillations present in NREM sleep with higher amplitude than the background delta waves. To identify spindles, the EEG data were divided into epochs from which appropriate features were extracted to differentiate spindles from the background EEG. The features used included the level of EOG activity, the number of significant peak-to-peak transitions, the wavelet packet energy (WPE) within the frequency band of interest (11–16 Hz) and the presence of K-complexes. EOG activity was tracked to identify NREM sleep sections. Spindles were marked as present in those epochs in which the WPE and peak-to-peak activity were higher than predetermined thresholds. The thresholds were reduced on detection of K-complexes, mimicking manual scoring.
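A minimal sketch of the wavelet-packet-energy feature for a single epoch is shown below. The sampling rate, wavelet, decomposition level and threshold are illustrative assumptions; the EOG, peak-to-peak and K-complex features of the full algorithm are only hinted at.

```python
# Simplified sketch of the wavelet-packet-energy (WPE) spindle feature for one
# epoch. Assumptions (not stated in the abstract): 100 Hz sampling, db4
# wavelet, level-4 decomposition and an illustrative energy threshold.
import numpy as np
import pywt

FS = 100.0           # assumed sampling rate (Hz)
LEVEL = 4            # 2**4 = 16 packet bands, each FS/2/16 = 3.125 Hz wide
BAND = (11.0, 16.0)  # sigma band for sleep spindles

def spindle_band_energy(epoch):
    """Return the wavelet packet energy of the epoch inside the 11-16 Hz band."""
    wp = pywt.WaveletPacket(data=epoch, wavelet='db4',
                            mode='symmetric', maxlevel=LEVEL)
    nodes = wp.get_level(LEVEL, order='freq')   # terminal nodes sorted by frequency
    width = FS / 2.0 / len(nodes)               # bandwidth of each node
    energy = 0.0
    for i, node in enumerate(nodes):
        lo, hi = i * width, (i + 1) * width
        if hi > BAND[0] and lo < BAND[1]:       # node overlaps the 11-16 Hz band
            energy += float(np.sum(node.data ** 2))
    return energy

def mark_epoch(epoch, threshold=2.0e4, k_complex_present=False):
    """Mark an epoch as containing a spindle if the band energy exceeds a
    threshold, lowered when a K-complex was detected (mimicking manual scoring)."""
    thr = threshold * (0.8 if k_complex_present else 1.0)
    return spindle_band_energy(epoch) > thr

# Hypothetical 1-second epoch: background EEG noise plus a 13 Hz spindle burst
t = np.arange(0, 1.0, 1.0 / FS)
epoch = 10 * np.random.randn(t.size) + 30 * np.sin(2 * np.pi * 13 * t)
print(mark_epoch(epoch))
```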
The accuracy of the developed algorithm was verified by comparison with the manual scoring performed by a sleep specialist on the same EEG data. The results from the algorithm look promising, with a good degree of agreement with the manual scoring. When run on 3 hours of EEG data with 52 manually scored spindles, the algorithm successfully detected 42 of them (80.7%), and of the total 21,600 epochs analyzed, 290 were falsely detected as containing spindles. It was also observed that the true detection rate can be increased by varying the thresholds, although this introduces further false detections.
-
-
-
Electrocardiogram QRS Detection Using Temporal Correlation
Authors: Jongil Lim, Reza Tafreshi and Abdul Jaleel
Abstract
Myocardial infarction (MI) is one of the most common sudden-onset heart diseases. Early diagnosis of MI is essential for management and treatment initiation. The electrocardiogram (ECG), a noninvasive electrical recording of the heart's behavior, is one of the most reliable diagnostic tools for identifying patients with suspected MI. The QRS complex is the major feature of an ECG, and many QRS detection algorithms have been developed. However, current QRS detection algorithms produce many false detections due to various types of noise or disturbance and sudden changes in the QRS complex.
We propose a novel QRS detection algorithm based on the use of simple pattern matching techniques in order to increase the accuracy of QRS detection. The algorithm aims to achieve better detection by grouping different ECG waveforms into 5 fundamental groups and then proceeding towards correction of detections based on this classification.
The ECG must first be filtered for high-frequency noise and baseline drift in order to be diagnostically useful. The filtered ECG is classified into standard and nonstandard groups using parabolic fitting, and QRS detection is performed on these groups. The algorithm then re-classifies the waveforms into 5 fundamental types of ECG and improves the detections using temporal correlation between successive ECG beats for further corrections. After all the appropriate corrections, identical waveform types on each lead were obtained. The efficiency of the algorithm was calculated from its true detection rate. The QRS detection algorithm was tested using data from 20 MI patients from the PTB diagnostic ECG database.
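The sketch below illustrates the general idea of combining filtering, candidate peak detection and temporal correlation between successive beats; it is a simplified stand-in, not the authors' algorithm (the parabolic-fitting classification into five waveform groups is omitted), and the sampling rate and thresholds are assumptions.

```python
# Simplified QRS-detection sketch showing the role of temporal correlation
# between successive beats; NOT the authors' exact algorithm. Sampling rate
# and all thresholds are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 1000.0  # sampling rate (Hz), assumed here

def detect_qrs(ecg, fs=FS):
    # 1) band-pass filter to suppress baseline drift and high-frequency noise
    b, a = butter(3, [5.0 / (fs / 2), 40.0 / (fs / 2)], btype='band')
    filtered = filtfilt(b, a, ecg)

    # 2) candidate R peaks from the squared derivative (energy of fast slopes)
    energy = np.gradient(filtered) ** 2
    peaks, _ = find_peaks(energy, height=0.3 * np.max(energy),
                          distance=int(0.25 * fs))

    # 3) temporal correlation: keep candidates whose surrounding waveform
    #    correlates well with the average beat template
    half = int(0.1 * fs)
    candidates = [c for c in peaks if c - half >= 0 and c + half < len(filtered)]
    if not candidates:
        return np.array(candidates)
    beats = [filtered[c - half:c + half] for c in candidates]
    template = np.mean(beats, axis=0)
    keep = [c for c, beat in zip(candidates, beats)
            if np.corrcoef(beat, template)[0, 1] > 0.6]
    return np.array(keep)

# Hypothetical usage with a crude synthetic 10-second ECG-like signal
t = np.arange(0, 10, 1 / FS)
ecg = 0.1 * np.sin(2 * np.pi * 1.2 * t)
ecg[(np.arange(len(t)) % int(FS / 1.2)) < 20] += 1.0  # QRS-like spikes
print(detect_qrs(ecg))
```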
The algorithm resulted in a true detection rate of 98.9%. Our experiment showed that 199 leads among the 220 leads in 20 data sets were successfully classified into the five major groups. This proved to be a key step towards improving the accuracy of the algorithm as most of the waveforms belong to these major groups. As expected, our results confirmed that typical ECG waveforms are composed of successive ECG beats of similar patterns with little variation from one ECG beat to another.
-
-
-
Posaconazole as Prophylactic Therapy in Cancer Patients: Analysis and Pharmacokinetics
Authors: Dalia Hamdy, Hajer El-Geed, Samah El-Salem and Manal Zaidan
Abstract
Introduction: Posaconazole (PZ), an antifungal prophylactic therapy in hematologic cancer patients, was added to the Al-Amal Hospital formulary in 2010. The objectives of this study are: 1. To identify the practice guidelines and pharmacokinetic information regarding PZ use in Qatar and worldwide. 2. To conduct a drug use evaluation (DUE) report at Al-Amal Hospital.
Methods: A literature review was conducted to address the first objective. A retrospective DUE report was conducted on 10 randomly chosen hematologic cancer patients who used PZ during the year 2010. Patients' profiles were reviewed and data were collected into a pre-prepared collection sheet.
Results: PZ was approved for prophylaxis in hematologic cancer patients ≥13 years in the USA, Canada, and Australia, ≥18 years in the European Union and >15 years in Qatar. PZ has low bioavailability that can be enhanced by co-administration with high-fat meals and by dividing the total daily dose. Data regarding PZ therapeutic drug monitoring (TDM) are controversial. PZ undergoes several drug-drug interactions. For example, co-administration of proton pump inhibitors may result in sub-therapeutic PZ levels. Co-administration of vincristine may result in higher neurological toxicity, mainly gastrointestinal problems, due to the inhibitory potency of PZ on cytochrome P450 enzymes. A patient receiving a vincristine-based chemotherapy protocol concurrently with PZ developed a seizure. Another patient developed a mild breakthrough fungal infection while on PZ prophylaxis.
Conclusion: The PZ regulations in Qatar are similar to the worldwide recommendations, and the PZ practice at Al-Amal Hospital abides by these regulations. Possible serious PZ drug-drug interactions, such as seizures, in hematologic cancer patients should be highlighted and carefully monitored.
-
-
-
Deformation of Imbedded Blood Vessels Due to Uniform Pressure
Abstract
We consider the deformation of a blood vessel imbedded in soft tissue that is surrounded by a rigid structure. The vessel deforms when the difference between its external and internal pressures exceeds a certain value. To represent the deformation, we use a physical model consisting of two concentric cylinders tethered by numerous nonlinear springs representing the biological tissues surrounding the vessel (see Figure A); the outer cylinder is taken to be rigid while the inner one is taken to be thin-walled, elastic and free to deform. We formulate the governing equations and develop suitable numerical techniques for calculating the shape of the cross section of a deformed vessel and the blood flow rate through it (see Figure B). The dependence of the deformation and the blood flow rate on the elastic parameters is shown (numerically) to be convex. This allows the formulation of a well-behaved "inverse problem," in which the elasticity of the surrounding soft tissue can be detected from measurable data consisting of pressure, cross-sectional shape and blood flow rate. Since testing the elasticity of human tissue can only be done in vivo, and since such information is important as an aid in the diagnosis of some diseases, the present study serves as an advancement in the non-invasive testing of the elasticity of certain soft tissues in the human body.
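The governing equations are not reproduced in the abstract; a generic formulation of how the flow rate through a deformed cross section is usually obtained (assuming steady, fully developed axial flow of a Newtonian fluid), which may or may not match the authors' exact model, would be:

```latex
% Generic sketch (assumption, not necessarily the authors' formulation):
% steady, fully developed axial flow through the deformed cross section
% \Omega with no-slip on the wall \partial\Omega.
\mu \,\nabla^{2} w(x,y) \;=\; \frac{dp}{dz},
\qquad w = 0 \ \text{on} \ \partial\Omega,
\qquad Q \;=\; \iint_{\Omega} w \, dA .
% For an undeformed circular cross section of radius a this reduces to the
% Poiseuille result  Q = \pi a^{4} \Delta p / (8 \mu L).
```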
-
-
-
Elevation of Alpha Acid Glycoprotein (AGP) does not Correlate with the Resistance of Chronic Myeloid Leukaemia (CML) to Imatinib Mesylate (IM)
Authors: Nader Izz Eddin Al-Dewik, Hanadi El Ayoubi, Andy Jewell and Hisham Morsi
Abstract
Background: Despite the efficacy of IM in treating CML, a high degree of resistance has already been noted.
AGP may reduce drug efficacy through its ability to interact with IM.
Objectives: To determine whether the level of AGP correlates with CML resistance/response to treatment and whether it could be employed as a biological marker for CML resistance.
Methods: Serum samples from 25 CML patients were analysed to determine AGP levels. The immunoturbidimetric assay is based on the formation of a precipitate of AGP with a specific antiserum. The mean, the variance and the significance of differences between means were determined using Student's t test, with significance set at a p value <0.05.
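For illustration only, the group comparison described above can be reproduced with a standard two-sample t test; the values below are hypothetical placeholders, not the study data.

```python
# Illustrative two-sample Student's t test comparing mean AGP levels between a
# patient group and controls. All values are HYPOTHETICAL, not the study data.
from scipy import stats

agp_patients = [1.1, 1.4, 0.9, 1.3, 1.2, 1.0, 1.5, 1.2, 1.2]   # e.g. chronic phase
agp_controls = [0.70, 0.75, 0.68, 0.74, 0.71, 0.73, 0.69, 0.72, 0.74, 0.70]

t_stat, p_value = stats.ttest_ind(agp_patients, agp_controls)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, significant = {p_value < 0.05}")
```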
Results: Over 2 years a total of 89 serum samples were collected from 25 CML patients treated at Al Amal hospital in Qatar. Ten samples from 10 healthy volunteers were collected as a control group.
Nine patients presented with CML in Chronic Phase (CP), 5 in Accelerated Phase (AP), 6 patients progressed while on treatment (Poor Responders) and 5 more patients were undergoing treatment and were in Complete Haematological Remission (CHR) at the time of sample collection.
The mean AGP levels were 1.2 (±0.3), 1.61 (±0.4), 1.01 (±0.08), 1.07 (±0.09), and 0.72 (±0.04) for CP, AP, Poor Responders, CHR and controls respectively.
The mean AGP level of the control group was significantly lower than that of each of the disease groups.
The p values for the comparisons of the CP, AP, Poor Responder and CHR groups with the control group were 0.001, 0.03, 0.003 and 0.005, respectively.
On the other hand, among these different CML groups there was no significant difference in AGP levels, even when correlated with white blood cells, platelets and/or basophils.
However, there was a significant difference between CP and AP patients (P value 0.002) when AGP was correlated with WBCs.
Nonetheless AGP level could not be correlated with course of disease.
Conclusions: The resistance observed in our CML patient population could not be correlated with AGP levels, as patients responded to or resisted treatment without any recognisable pattern of AGP; even when patients achieved CHR they might still have had elevated AGP levels.
-
-
-
BCR-ABL Kinase Point Mutations Do Not Correlate with the Resistance of Chronic Myelocytic Leukemia (CML) to Imatinib Mesylate (IM): A Study of the CML Patient Population in Qatar
Authors: Nader Izz Eddin Al-Dewik, Hanadi El Ayoubi, Andy Jewell and Hisham Morsi
Abstract
Background: More than 45% of CML patients in Qatar resist the first line of treatment; internationally, certain ABL mutations are the most common cause of IM resistance.
Objectives: To screen for BCR-ABL kinase mutations in CML patients treated in Qatar and to study if point mutations can be correlated with resistance to treatment.
Methods: Peripheral Blood (PB) and Bone Marrow (BM) samples were collected from 25 patients; total RNA was extracted and cDNA was produced via RT-PCR with special precautions to avoid amplification of wild type ABL and cover the whole ABL kinase domain.
Results: Over a period of three years, 39 PB and 30 BM samples from 25 patients receiving IM were studied for ABL mutations prior to treatment and at time of resistance.
For all 25 patients we noticed three nucleotide changes, at A1258G, A1426G and A1739G of ABL (GenBank accession no. M14752). However, when we compared these changes with major SNP databases (NCBI, ENSEMBL), they have been described by others as ancestral alleles that do not convey any pathological change.
Although we found no evidence of ABL point mutations in patients at the time of resistance, in one patient, who had complex cytogenetic abnormalities, we noticed a transient insertion of three nucleotides (AAG) at position 1432, which added an amino acid (lysine 356) at the time of resistance.
This patient was shifted to dasatinib and achieved major molecular response after three months of treatment.
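As a side note, an in-frame three-nucleotide insertion adds exactly one residue, and the AAG codon encodes lysine; the toy sketch below illustrates this on a short hypothetical fragment (not the actual ABL sequence or numbering).

```python
# Sketch of how an in-frame three-nucleotide insertion adds a single residue:
# the AAG codon encodes lysine (K). The sequence below is a short HYPOTHETICAL
# fragment, not the actual ABL kinase-domain sequence or numbering.
from Bio.Seq import Seq

wild_type = Seq("ATGGCTGAAGGTCTG")        # codons: ATG GCT GAA GGT CTG -> M A E G L
insertion_pos = 9                          # in-frame position (multiple of 3)
mutant = wild_type[:insertion_pos] + "AAG" + wild_type[insertion_pos:]

print(wild_type.translate())               # MAEGL
print(mutant.translate())                  # MAEKGL -> one extra lysine (K)
```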
Conclusions: Due to the high rate of resistance of CML to IM, we tested our patients for BCR-ABL point mutations but could not find any of the described ABL kinase domain mutations.
The significance of the insertion of the three nucleotides is still to be determined.
However, it must be kept in mind that direct sequencing has limited sensitivity and might miss a low-level mutation (less than 30% of the total ABL domain).
An alternative approach such as High Resolution Melting (HRM) technology accompanied with sequencing might be needed to detect and quantify low level mutations.
-
-
-
Vitamin D Status in Pregnant Women and their Babies in Qatar
Authors: Samar Al-Emadi and Mohammed Hammoudeh
Abstract
Background and Objectives: Vitamin D deficiency is very common in pregnant women, and the current guideline of 200-400 IU for vitamin D intake during pregnancy has been challenged recently. We conducted this study to determine the prevalence of vitamin D deficiency among pregnant women and to evaluate the safety of weekly oral 50,000 IU vitamin D supplementation for the mother and the newborn.
Setting and design: A prospective study at the Hamad Medical Corporation outpatient unit and delivery room.
Patients and Methods: 97 pregnant women were recruited in their first trimester between December 2007 and March 2010. Weekly oral vitamin D 50,000 IU was prescribed after initial testing of serum levels of 25-hydroxyvitamin D, parathyroid hormone, calcium, phosphorus, total protein and albumin. Other multivitamin supplements were allowed during pregnancy. The same tests were repeated at each trimester, and umbilical cord vitamin D levels were determined at birth.
Results: Out of 97 women, 8 dropped out of the study for several reasons and 19 had miscarriages.
Data were available for 97 women in the first trimester, 78 women in the second trimester and 61 women in the third trimester. The mean level of vitamin D was 17.15 ng/ml in the first trimester, prior to starting vitamin D supplementation, 29.08 ng/ml in the second trimester, 27.3 ng/ml in the third trimester and 22.36 ng/ml in newborns. There were no toxic levels of vitamin D in any of the women in the second or third trimester or in the newborns. The mean levels of vitamin D in the second and third trimesters were not significantly different between the women who were taking multivitamin supplementation and those who were not.
Conclusion: A weekly dose of 50,000 IU vitamin D during pregnancy is safe in our population and maintains an acceptable vitamin D level during pregnancy, and the newborns' vitamin D levels correlate with the mothers' levels.
-
-
-
The Outcome of Severe Traumatic Brain Injury in Children in Qatar: A Six-Year Study
Abstract
Background: Traumatic brain injuries (TBIs) remain an important public health problem in most developed and developing countries and may result in temporary or permanent disability.
Objective: The aim of this study was to determine the incidence pattern and burden of severe TBIs among young children in Qatar and to suggest practical prevention policies that can be implemented in Qatar.
Methods: The study was conducted among children aged 14 years or less at the Children Rehabilitation Unit, Paediatric Department, Hamad General Hospital. Severity of TBI was assessed by Glasgow Coma Scale (GCS).
Results: This study was based on 65 children who suffered severe traumatic brain injury between January 2002 and December 2008; 12 of them died within the first month of admission to the paediatric intensive care unit. The predominant gender was male (73.8%), and non-Qatari children formed 50.8%. The predominant mechanisms of injury were road traffic accidents (84.6%), followed by falls (10.8%) and other causes such as head trauma from a roof fan blade (4.6%), followed by sports and recreation injuries. Among our patients, 43.1% had spasticity and 33.8% experienced post-traumatic epilepsy. The current study revealed that 24.6% had a communication disorder, 26.2% had poor cognition, 24.6% had hemiplegia, 18.5% had abnormal behavior and 15.4% were in a vegetative state. Nearly all the patients (98.5%) required physiotherapy and occupational therapy, 50.8% required speech therapy and swallowing assessment, 47.7% required braces (either ankle-foot orthoses or hand splints) and 16.9% required behavior therapy, whereas Botox injection was used in 60% of the spastic patients. Finally, the incidence of TBIs from road traffic crashes and injuries in Qatar is increasing significantly compared with other developing and developed countries.
Conclusion: The present study findings provide an overview of severe TBI in Qatar, which is mostly related to road traffic crashes and injuries. Special efforts should be made to reduce motor vehicle crashes and injuries involving young people, and welfare programs are also needed to limit the risk of TBI.
-
-
-
Role of Homocysteine Measurement for Early Diagnosis of Vitamin B12 Deficiency in the First Days of Life
Authors: Tawfeg Ben-Omran, Noora Shalbik, Hongying Gan-schreier, Ghassan Abdoh, Rehab Ali and Georg Hoffmann
Abstract
Background: Vitamin B12 (vit B12) deficiency is one of the major causes of megaloblastic anaemia and should be detected as early as possible, since supplementation of mother and child can prevent neurological symptoms in the baby. Furthermore, the neurological symptoms of affected children are (partially) reversible. Elevated methylmalonic acid in urine and homocysteine (Hcy) in plasma are sensitive indicators. In the State of Qatar, extended newborn screening for classical homocystinuria has been carried out for all 73,994 neonates over the last 4.5 years. Newborns with slightly elevated Hcy levels in dried blood spots (DBS) were followed up with regard to possible vit B12 deficiency. In addition, propionylcarnitine (C3) levels were analysed.
Methods: Determination of Hcy in DBS was performed using liquid chromatography electrospray tandem mass spectrometry. C3 levels were obtained from general newborn screening. The vit B12 levels in plasma were analysed spectrophotometrically.
Results: In all, 117 cases with mildly elevated Hcy levels were found, of which 65 were diagnosed with vit B12 deficiency. Only 9 of these 65 newborns had abnormal C3 levels. No correlation was found in this group between Hcy and C3 levels.
Conclusion: Extended neonatal screening of Hcy is a useful tool for early diagnosis and treatment of vit B12 deficiency.
-
-
-
Partial Analysis of Olfactory Receptor Subgenome in the Arabian Camel
Authors: Atef Khalaf Sayed, Jilian Rowe, Karsten Suhre and Benjamin Shykind
Abstract
Many animals have evolved mechanisms to withstand the harsh desert environment, characterized by extremely high temperatures and scarce water supplies. The Arabian Camel and the Arabian Oryx are valued economically and culturally. These animals can survive for several days without food or water. As the Arabian Peninsula undergoes rapid and vast industrial changes, it is increasingly important to understand the biology of these animals.
From unicellular microbes to sophisticated multicellular animals, sensing the chemical composition of the surrounding environment is essential for survival. The vertebrate chemosensory receptor genes, which are members of the seven-transmembrane α-helical G-protein coupled receptors (GPCRs), are encoded by six different multigene families. Four of these families encode receptor proteins for sensing odors. The olfactory (odorant) receptors (ORs) are predominately expressed in the sensory neurons of the main olfactory epithelium, and can sense either water-soluble (class I) or volatile (class II) molecules. Furthermore, certain OR genes are expressed in non-olfactory tissues, such as brain, kidney, testis, and placenta.
We hypothesize that desert animals, being adapted to very harsh conditions with elevated temperatures, scarce water supply and limited vegetation, have evolved the ability to detect water via their olfactory systems, either from volatiles liberated by water in the environment or through the blooming of short-lived vegetation.
To explore this possibility we have undertaken a study of the Camel OR genes. We identified approximately one hundred candidate OR genes, all of which are orthologous to OR genes in other mammals and most closely related to those of Equus caballus. Preliminary analysis revealed an enrichment in OR gene family 2/13, found in the highest proportion in aquatic animals, as compared to other mammals. This finding provides the intriguing suggestion that desert animals have evolved specific OR genes to adapt to the desert ecosystem.
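An enrichment observation of this kind is commonly supported by a contingency-table test; the sketch below uses Fisher's exact test with hypothetical counts and is not the authors' analysis.

```python
# Hedged illustration of the kind of enrichment test that could support the
# OR family 2/13 observation: Fisher's exact test comparing the proportion of
# family 2/13 genes in the camel repertoire with another mammal.
# All counts below are HYPOTHETICAL placeholders, not the study's data.
from scipy.stats import fisher_exact

camel_family_2_13, camel_other = 18, 82    # ~100 candidate camel OR genes
other_family_2_13, other_other = 60, 940   # hypothetical comparison species

odds_ratio, p_value = fisher_exact(
    [[camel_family_2_13, camel_other],
     [other_family_2_13, other_other]],
    alternative='greater')                 # test for enrichment in the camel
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```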
We are currently working to identify the complete OR gene repertoire in the Camel, and to identify and characterize the OR subgenome in other desert animals such as the Arabian Oryx.
-
-
-
Does the Number of Ports Affect Outcomes in Patients Undergoing Laparoscopic Pyloromyotomy? A Retrospective Chart-Review Study
Authors: Tariq O Abbas and Adel Ismail
Abstract
Background: Although open Ramsted's pyloromyotomy is the gold standard for the surgical management of infantile hypertrophic pyloric stenosis, laparoscopic pyloromyotomy has been found to be highly successful. Various factors, however, can affect the outcomes of surgical interventions in these patients. We observed a relationship between the number of ports used and outcome in patients undergoing laparoscopic pyloromyotomies.
Method: We retrospectively assessed the medical records of a selected group of patients who underwent laparoscopic pyloromyotomy at our institution. Factors analyzed included operation time, length of hospital stay, postoperative complications, and time to full postoperative feeding.
Results: We observed failure of myotomy in both of the two patients who underwent laparoscopic pyloromyotomy using only two working ports, compared with successful myotomies in the remaining patients.
Conclusion: Laparoscopy provides good results in terms of intraoperative exposure and cosmesis. However, a standardized surgical technique with two working ports is advisable, and this observation should trigger further research before it can be ascertained.
-
-
-
High Resistance Rate of Chronic Myeloid Leukaemia (CML) to Imatinib Mesylate (IM) Might be Related to Protein Tyrosine Phosphatase Receptor Type Gamma (PTPRG) Down-Regulation
Abstract
Background: CML is the most common myeloproliferative disease observed among adults; its first-line treatment is IM, with a response rate ranging between 55% and 90%. In Qatar the resistance rate is higher than 45%. Our collaborators in Italy recently reported on the relation between CML and PTPRG.
Methods: One cohort of patients (n=25, period = 3 years) receiving Imatinib was studied for haematological, cytogenetic, molecular and biochemical abnormalities.
Our collaborators in Italy examined different CML cell lines and an independent cohort of patients for the level of expression of PTPRG using QPCR, clonogenic assays, methylation-specific PCR, flow cytometry and western blotting.
Results: Our team reported previously on the high rate of resistance of CML to IM (45%). During this forum the team is further reporting on the possible underlying mechanisms behind this resistance (see Al-Dewik et al. at this forum). Despite a few positive findings, no pattern could be identified to delineate a significant underlying mechanism.
Our collaborators in Italy identified that down-regulation of PTPRG increased colony formation in the PTPRG+ve megakaryocytic MEG-01 and LAMA-84 cell lines, but had no effect in the PTPRG-ve K562 and KYO-1 lines.
Its over-expression had an oncosuppressive effect in all four cell lines and was associated with inhibition of BCR/ABL-dependent signalling. PTPRG was down-regulated at the mRNA and protein levels in CML patients in both PB and BM, including CD34+ cells, and was re-expressed following molecular remission of disease.
This re-expression was associated with loss of methylation of a CpG island of the PTPRG promoter in 55% of patients. In K562 cells, the hypomethylating agent 5-aza-2'-deoxycytidine induced PTPRG expression and caused inhibition of colony formation that was partially reverted by antisense-mediated down-regulation of PTPRG expression.
Conclusions: Although this study was performed on two independent patient populations, it suggests that in CML populations with a high resistance rate it might be worth examining the PTPRG expression level and correlating it with the pattern of resistance. Our group has secured 3 years of funding from QNRF (NPRP 4-157-3-052) to investigate PTPRG signalling in CML, including the study of a possible link between the high CML resistance rate and PTPRG expression levels.
-
-
-
Development of a Wearable and WBAN-Based Vital Signs Monitoring System for Low-Cost Personal Healthcare in Qatar
Authors: Eng Hock Tay and Dagang Guo
Abstract
Population aging is a worldwide phenomenon, but its impact on Qatar is unique. The proposed system aims at comprehensive and integrated monitoring of vital signs (ECG, arterial oxygen saturation (SpO2), blood pressure (BP) and heart rate (HR)) using a wearable sensor platform, without professional involvement and without interfering with the elderly's everyday activities. A novel wireless physiological sensor node with a single highly integrated board has been specifically designed and fabricated (Fig. 1(a)). The new board comprises an MCU, an ECG analog front-end, an LED driver and brightness-adjustment circuit for photoplethysmograph (PPG) measurement, a CC2420 chip for wireless communication and an FTDI FT232RL chip for MCU programming and real-time debugging. A miniaturized wireless gateway was also designed (Fig. 1(b)) to wirelessly receive the data from the sensor node and relay them to a PC for ongoing research on ECG denoising and arrhythmia classification.
A novel MEMS-based electrode has been designed and fabricated for ECG measurement, as shown in Fig. 2. Compared with conventional ECG electrodes, the micromachined electrode is more comfortable: there is no direct contact of gel with the human skin, and it imposes no side effects for continuous, long-term measurement. A unique characteristic of the proposed electrode is that the microneedle array is made of heavily doped silicon, which is electrically conductive and eliminates the requirement to deposit an Ag/AgCl or metal layer on the microneedles for electrical contact. The microneedles can directly pierce the outer skin surface, lowering the electrode-skin-electrode impedance (ESEI) and eliminating the need for skin preparation, which is a prerequisite for wet electrodes. For long-term monitoring, mechanical failure of the micro-needles may accidentally happen due to axial loading during the insertion process or transverse loading during the measurement. As a result, broken silicon needles would become debris in the skin, which raises health concerns for the user. Therefore, the critical buckling loads of the fabricated micro-needles were investigated using both theoretical estimation and ANSYS simulation. The results show that the critical buckling load is much larger than the theoretical insertion force, and therefore buckling will not occur during the insertion process.
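The theoretical estimate referred to above is presumably of the Euler type; the expression below is the standard Euler critical buckling load, stated here under the assumption of a slender column fixed at the base and free at the tip, which may differ from the exact boundary conditions used in the paper.

```latex
% Standard Euler critical buckling load for a slender microneedle
% (assumed fixed at the base and free at the tip, effective-length factor K = 2).
P_{cr} \;=\; \frac{\pi^{2} E I}{(K L)^{2}} \;=\; \frac{\pi^{2} E I}{4 L^{2}},
\qquad I = \frac{\pi d^{4}}{64} \ \text{for a solid circular cross section of diameter } d .
% Insertion is safe against buckling as long as P_{cr} exceeds the insertion force.
```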
This work is supported by Qatar National Research Funding (QNRF) under the grant NPRP 09-292-2-113.
-
-
-
Pharmacovigilance in the Middle East
Authors: Kerry Wilbur, Amna Fadul and Hala Sonallah
Abstract
Background: The importance of countries supporting their own national pharmacovigilance programs cannot be overstated: citizens may have unique ethnicities, traditions, and diets influencing reactions to medication; alternative brands of therapy may be imported or manufactured and differ in ingredients or production processes; and ADRs and toxicities associated with traditional and herbal remedies also need to be monitored. The objective of this study was to inventory the national pharmacovigilance systems in place in the Middle East region.
Methods: The Uppsala Monitoring Center Assessment of Country Pharmacovigilance Situation (February 2008) was adapted and translated into Arabic. Survey domains pertain to general program overview; information technology support; suspected ADR reporting and subsequent data use; pharmacovigilance activity and advocacy. A comprehensive search was conducted to determine the existence of a governing body responsible for medication safety in 13 Arabic speaking Middle Eastern countries. Surveys were emailed to the head of the identified centres, with follow-up messages and telephone calls subsequently made as necessary.
Results: Data for 10 countries were obtained; representatives from two countries did not respond (Lebanon, Syria). Six described formal national pharmacovigilance programs (Egypt, Iraq, Jordan, Oman, Saudi Arabia, and the UAE), while five (Bahrain, Kuwait, Palestine, Qatar, Yemen) reported no active program or designated center. All active programs were recently formed (<5 years). The majority (83%) are government funded, and two (33%) receive suspected ADR reports and offer drug information services. Most (83%) welcomed reports from a wide variety of health professionals, as well as from the public. Sixty-seven percent facilitated submission to the centre by email, but none directly through a web-based platform. All used the information for drug regulatory purposes, but only 2 (33%) reported dissemination of safety information to the public.
Conclusion: This is the first comprehensive review of the status of pharmacovigilance in the Middle East. While a number of countries participate in suspected ADR reporting activities, an estimated population of 30–50 million is without formal domestic programs. Technology must be exploited to ease spontaneous reporting and subsequent data management. Existing mechanisms for regional collaboration should be advanced so experience from model programs can be shared.
-
-
-
The Assignment of the Gene Responsible for Congenital Cataract and Microphthalmia to the Pericentromeric Region of the X Chromosome and Examination of Candidate Genes
Authors: Vasiliki Chini, Diana Mina, Jamil Alami and Hatem El-Shanti
Abstract
Background: X-linked diseases are single-gene disorders that are due to the presence of mutations in genes that reside on the X chromosome. X-linked recessive disorders are predicted from the family structure, where only boys are affected and there is no father-to-son transmission of the mutant allele. Heterozygous females are usually non-symptomatic carriers but can manifest a milder form of the disease. The identification of the genetic defect in X-linked disorders facilitates the diagnosis of affected individuals, aids in providing informative counseling and may help in prenatal diagnosis. Objectives: The study aims at mapping and identifying the gene responsible for congenital cataract and microphthalmia in a three-generation family.
Methods: We recruited 12 members of a family with a clear X-linked pattern of inheritance, including three affected males, all showing congenital cataracts and microphthalmia. Gene mapping was attempted using a set of microsatellite markers selected to cover the whole X chromosome. Haplotypes were generated from all genotypes and examined for alleles shared by the affected males and not shared by the unaffected males. Once the region of linkage was identified, we examined a few candidate genes by mutation analysis, resequencing, in forward and reverse, one affected individual, one obligate carrier and one unrelated normal control. Candidate genes were chosen from the human genome public databases and were selected based on the possibility that they play a role in eye development or are expressed in fetal eyes.
Results: The region of linkage is a 50 Mb interval in the pericentromeric region of the X chromosome (Xp21.1-q21.2). The candidate genes ARR3, DACH2 and BCOR were resequenced in forward and reverse, but no variations were detected.
Conclusions: We were able to map the gene responsible for congenital cataracts and microphthalmia to the pericentromeric region of the X chromosome. We examined 3 candidate genes but no variations were detected, and we are currently examining other candidate genes. If no mutant alleles are identified by this candidate gene approach, we will proceed to whole exome sequencing of the X chromosome (after enrichment) utilizing next-generation sequencing technology.
-
-
-
The Spectrum of MEFV Mutations in an Arabic Cohort
Authors: Abdulghani Khilan, Rowaida Taha, Dina Ahram, Suhail Ayesh, Jamil Alami and Hatem El-Shanti
Abstract
Background: Autoinflammatory diseases are a group of disorders characterized by seemingly unprovoked inflammation in the absence of high-titer autoantibodies or antigen-specific T cells. Familial Mediterranean Fever (FMF) is an autosomal recessive disorder. It is characterized by recurrent self-limiting episodes of fever and painful polyserositis. FMF is prevalent in specific ethnic groups—namely, non-Ashkenazi Jews, Armenians, Turks, and Arabs. There seems to be a distinctive clinical picture in Arab patients with FMF, and the range and distribution of MEFV mutations is different from that noted in other commonly affected ethnic groups.
Objectives: The aim of this study is to delineate the spectrum and distribution of MEFV mutations amongst an Arabic FMF patient cohort and to assist the genotype-phenotype correlation in these patients.
Methods: We have collected DNA samples from 188 FMF patients (from Qatar, Jordan and Palestine) who have been clinically diagnosed with FMF, according to international and validated diagnostic criteria. We have designed primers to cover the entire genomic region of MEFV. As a first tier, mutation detection is done by resequencing the entire coding sequence and splice sites; as a second tier, the rest of the genomic region, including the promoter, is resequenced.
Results: In the first tier, we identified 191 of 376 mutant alleles (50%) by resequencing the entire coding region and splice sites of MEFV. In addition, resequencing of the entire genomic region of 100 patients who had only one identifiable allele was carried out, resulting in the identification of specific haplotypes; we are currently investigating the phenotypic significance of these haplotypes.
Conclusions: The spectrum of MEFV mutations in Arabs seems different from that in other ethnic groups commonly affected by FMF. The fraction of identifiable disease-causing alleles is the lowest among the commonly affected ethnic groups. The results of the genomic resequencing of MEFV may provide some insight into the role of non-coding sequences and may explain the molecular pathology of FMF. We are therefore currently working on the development of a low-cost and high-throughput technique to facilitate resequencing of the entire genomic sequence of MEFV using next-generation sequencing technology.
-
-
-
Discovery of a Probable Gene Mutation Causing Mental Retardation, Microsomia, and Signs of Skeletal Dysplasia in an Arab Family with a Previously Undelineated Autosomal Recessive Disorder
Authors: Mazen Naim Osman, Yasser Al-Sarraj, Ghing Billedo, Samiha Zaineddin, Hatem El-Shanti and Jamil Alami
Abstract
Background: Autosomal recessive diseases are considered a major group of single-gene disorders among Arab populations. We have recruited a family with three siblings with a mental retardation (MR) syndrome who were born to consanguineous Qatari parents. The clinical problems comprised significant mental retardation, microsomia, signs of skeletal dysplasia, and thoracolumbar kyphosis. The oldest patient also suffers from epileptic seizures. The parents and the other three of their six children are healthy. The strategy for mapping the causative gene focused on large genomic regions demonstrating homozygosity in all of the affected individuals.
Objectives: Our goal is to identify the genetic causes of undelineated autosomal recessive disorders among Arab families.
Methods: Whole-genome genotyping was performed (Illumina 300K SNP array), followed by homozygosity mapping and linkage analysis. In addition, targeted resequencing of the candidate genes within the linked homozygous loci was performed.
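A minimal sketch of the homozygosity-mapping step is given below: it scans for runs of SNPs that are homozygous in every affected individual. The genotype coding, run-length threshold and input data are illustrative assumptions, not the study's actual pipeline (which also used linkage analysis).

```python
# Minimal homozygosity-mapping sketch: find runs of consecutive SNPs that are
# homozygous in every affected individual and long enough to be candidate
# autozygous regions. Genotypes are coded 0/1/2 (minor-allele counts), where
# 1 = heterozygous. The input below is a tiny hypothetical example.
import numpy as np

def shared_homozygous_runs(genotypes, positions, min_snps=25):
    """genotypes: array (n_affected, n_snps); returns list of (start_bp, end_bp)."""
    shared = np.all(genotypes != 1, axis=0)        # homozygous in all affected
    runs, start = [], None
    for i, ok in enumerate(shared):
        if ok and start is None:
            start = i
        elif not ok and start is not None:
            if i - start >= min_snps:
                runs.append((positions[start], positions[i - 1]))
            start = None
    if start is not None and len(shared) - start >= min_snps:
        runs.append((positions[start], positions[-1]))
    return runs

# Hypothetical: 3 affected siblings, 200 SNPs, one shared homozygous block
rng = np.random.default_rng(0)
geno = rng.integers(0, 3, size=(3, 200))
geno[:, 80:140] = 2                                # simulate a shared autozygous block
pos = np.arange(200) * 50_000                      # ~50 kb SNP spacing
print(shared_homozygous_runs(geno, pos))
```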
Results and conclusions: Homozygosity mapping revealed a single large shared region of homozygous SNPs on the long arm of chromosome 4, flanked by rs28419770 (4q13.1) and rs4105671 (4q21.23). This block contains more than 120 genes, none of which has so far been implicated in MR or any of the above-mentioned phenotypes. Sequencing of candidate genes within the region revealed two novel missense variations in the FRAS1 gene: Arg3099Gln and Thr3149Met. Both variations were found in the homozygous state in the three affected siblings, while the parents were heterozygous. Furthermore, these two variations were not found in the homozygous state in 140 control individuals; a heterozygous pattern was found in only three individuals. Our future plan is to perform whole exome sequencing of the shared region using a next-generation sequencing platform.
-
-
-
Characterization of the LPIN2 Gene and its Protein and Examination of its Role in Psoriasis
Authors: Yasmin Walid Abu Aqel, Fatma Abdallah, Hanan Abu Nada, Mazen Osman, Jamil Alami and Hatem El-Shanti
Abstract
Psoriasis is a chronic inflammatory skin disease posing a considerable worldwide health problem due to its high prevalence, associated morbidity and high health-care costs. It is a multifactorial "complex" disorder, with compelling evidence for a genetic predisposition. On the other hand, Majeed syndrome, a Mendelian disorder of bone and skin inflammation, is caused by homozygous mutations in LPIN2. Many observations have implicated LPIN2 in the genetic etiology of psoriasis, including its position in a psoriasis locus. We identified several non-synonymous SNPs within LPIN2 in patients with psoriasis that are not present in healthy controls.
We hypothesize that the variations in LPIN2 play a role in the susceptibility to development of psoriasis and that LPIN2 is the psoriasis susceptibility locus on 18p. We aim to examine this hypothesis by examining the properties of the wild type and mutant proteins, as well as examining any difference in function between the wild type and mutants.
We have obtained custom-synthesized cDNA clones encoding the full wild-type Lipin2 protein and the six identified mutant proteins (p.K387E, p.S734L, p.A331S, p.L504F, p.P348L, p.E601K). The cDNA clones were subcloned into the pYES2 vector for expression in yeast cells (Saccharomyces cerevisiae), and each construct was transformed into Saccharomyces cerevisiae for protein expression. The analysis utilizes SDS gel electrophoresis and Western blotting.
The DNA analysis indicates that each fragment has been correctly cloned into the pYES2 vector. The analyses using SDS gel electrophoresis and Western blotting indicate that the wild-type and p.K387E proteins are successfully expressed in S. cerevisiae, while p.S734L is expressed in S. cerevisiae but at a lower level. Expression experiments are being done on the 4 remaining mutant proteins.
We were successful in artificially expressing the human Lipin2 protein in its different forms in yeast cells. We are currently optimizing the conditions to produce substantial amounts of the proteins, which will be studied by circular dichroism to determine their folding patterns. Other methods will be applied to study their function.
-
-
-
Prevalence of Autism Spectrum Disorders in Qatar
Abstract
The prevalence rate of autism-spectrum disorders (ASD) in Qatar is uncertain, and speculation that their incidence is increasing continues to cause concern. Although the apparent increased prevalence of autism may reflect improved detection and recognition of autism and its variants, no comprehensive survey has been done to estimate the prevalence of autism in Qatar.
The target population for this study is children aged 3 through 18 years whose parents resided in Qatar.
Children with ASD in Qatar will be identified using a two-phase process.
In Phase 1, children from a representative sample of all primary schools in Qatar will be preliminarily screened using the Social Communication Questionnaire, and those suspected of having ASD will be approached in Phase 2. Records of children with possible ASD will also be reviewed from the following institutions:
1). Records of the Shafallah Center for Children with Special Needs, which include all children with a preliminary diagnosis of ASD who attend special classes for autism.
2). Other centers and schools with similar facilities.
3). Records from the Supreme Council of Health, Hamad Medical Corporation, and any other health centers.
In Phase 2, a clinical evaluation is conducted by a developmental psychologist and/or paediatrician. It includes a medical, developmental and behavioural history and a standard physical and neurologic examination. In addition, the Autism Diagnostic Interview-Revised (ADI-R) and the Autism Diagnostic Observation Schedule-Generic (ADOS-G) will be administered.
Preliminary analysis of 179 subjects showed the highest prevalence among age group 7–14 years (61%).
The male/female ratio was 82%/18%, or roughly 5:1. Further work is needed to calculate the total prevalence rate. Obtaining a reliable estimate is important for planning the health care and educational services needed to improve the overall outcome of autism.
-
-
-
Gene Identification in Autosomal Recessive Forms of Familial Epilepsy
AbstractBackground: Epilepsy is a disorder of the brain characterized by an enduring predisposition to generate recurrent epileptic seizures, as well as by its neurobiologic, cognitive, psychological, and social consequences. The estimated proportion of the general population with active epilepsy at a given time is 10 per 1,000 people. The cause of epilepsy remains unknown in a substantial proportion of affected individuals. There is considerable evidence of the role of genetics in the predisposition to epileptic seizures. There is a need to identify the genes that predispose to epilepsy.
Objectives: The objective of this study is to identify the genes responsible for specific forms of familial epilepsy using homozygosity mapping and mutation detection analyses.
Methods: In this study we recruit families in which epilepsy segregates in a suggested autosomal recessive pattern. Homozygosity mapping is applied after genotyping with 370K SNP chips (Illumina platform). The gene identification is performed by candidate gene approach and direct resequencing.
Results: We recruited a consanguineous two-generation family with five affected individuals from two related sibships. All patients were clinically diagnosed and the clinical picture delineated. The gene responsible for the epilepsy in this family has been mapped to a 10 MB region on chromosome 11. At least 10 candidate genes, including SHANK2, SYT12, CFL1 and KCNK4 were examined for mutations but no specific mutations were identified as of yet.
Conclusion: Further examination of other candidate genes is ongoing. In parallel, genomic sequencing utilizing next-generation sequencing technology is in progress.
-
-
-
Study of Undelineated Autosomal Recessive Disorder among Arabs
Authors: Jamil Al-Alami, Yasser Al-Sarraj, Yosra Bejaoui, Mazen Osman, Eman Abuazab, Mohammed El-Dow and Hatem El-ShantiAbstractBackground: The number of genes identified to be responsible for autosomal dominant genetic conditions far exceeds those identified for autosomal recessive conditions. This is expected because autosomal recessive disorders are rare and a single large family or a large number of smaller families are needed for gene mapping and identification. However, this hurdle can be overcome by homozygosity mapping utilizing inbred families.
Objectives: The aim of this study is to map loci and identify genes that play a role in autosomal recessive disorders among Arab families and to examine their role with the final aim of outlining novel genes and pathways. The investigation capitalizes on utilizing large inbred families, as well as smaller inbred families, by employing homozygosity approaches for mapping the etiologic genes.
Methods: The study includes the recruitment of families and the collection of detailed clinical, genealogical and genotypic data. Consanguineous families with pedigrees that provide suggestive evidence of an autosomal recessive mode of inheritance are selected. Homozygosity mapping is done by genome-wide SNP genotyping using dense chips, followed by linkage analysis.
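As a minimal sketch of the core homozygosity-mapping step described above (and not the authors' actual pipeline, which also involves dense SNP chips and formal linkage analysis), the following Python snippet flags SNPs that are homozygous and identical in every affected individual and reports contiguous runs of such SNPs; the SNP identifiers and genotypes are hypothetical placeholders.

```python
# Minimal sketch: find runs of SNPs that are homozygous and shared by all
# affected individuals (hypothetical data; not the study's pipeline).

def shared_homozygous_runs(genotypes, snp_ids, min_len=3):
    """genotypes: dict individual -> list of genotype calls, same SNP order."""
    n_snps = len(snp_ids)
    shared = []
    for i in range(n_snps):
        calls = [g[i] for g in genotypes.values()]
        homozygous = all(c[0] == c[1] for c in calls)   # e.g. "AA", "GG"
        identical = len(set(calls)) == 1                # same genotype in all sibs
        shared.append(homozygous and identical)

    runs, start = [], None
    for i, flag in enumerate(shared + [False]):         # sentinel closes last run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                runs.append((snp_ids[start], snp_ids[i - 1]))
            start = None
    return runs

# Hypothetical toy input: three affected sibs, five SNPs.
affected = {
    "sib1": ["AA", "GG", "TT", "CC", "AG"],
    "sib2": ["AA", "GG", "TT", "CC", "AA"],
    "sib3": ["AA", "GG", "TT", "CC", "AA"],
}
print(shared_homozygous_runs(affected, ["rs1", "rs2", "rs3", "rs4", "rs5"]))
# -> [('rs1', 'rs4')]
```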
Results: We performed homozygosity mapping on one recruited family to seek a region of homozygosity shared by affected individuals. This family includes eight individuals from four related sibships in an extended Palestinian family who suffer from congenital cataract. Homozygosity mapping revealed a region flanked by rs4276160 (3p22.1) and rs749512 (3p21.31) on the short arm of chromosome 3. This interval contains 92 genes, none of which has been implicated in eye disease. Further investigation of this family is currently underway using whole exome sequencing to identify the causative gene mutation. Other examples will be presented.
Conclusion: Homozygosity mapping utilizing inbred families is a very powerful tool for gene mapping in autosomal recessive families. However, the region of linkage is usually large and contains a large number of genes, which is prohibitive with classic technology. Genomic approaches such as whole exome sequencing are yet another powerful tool to overcome this hurdle.
-
-
-
Breast Cancer Screening Amongst Arabic Women Living in the State of Qatar: Preliminary Results of the Cross-Sectional Community Based Survey
AbstractBackground: Breast cancer is the most common cancer among women in Qatar; its incidence rate is rising and it is often diagnosed at an advanced stage. Early detection of breast cancer through regular screening activities has been found to decrease morbidity and mortality rates. Although research on breast cancer screening in the Middle East is scarce, low levels of knowledge and poor participation rates have been found to act as barriers towards breast cancer screening activities such as breast self-examination, clinical breast examination and mammography. Various other barriers have been described in the literature. Identification of these potential barriers and facilitators is urgently needed in order to develop culturally appropriate interventions aiming to improve awareness and breast cancer screening participation rates.
Objectives: A three-phase research program for which the goals are to (1) understand breast health issues in Qatar; (2) identify and implement strategies that assist Arabic women's participation in breast cancer screening activities; and (3) evaluate, facilitate and sustain these strategies.
Methods: In Phase 1, two studies are conducted. Study 1: this quantitative study examines data from a convenience sample of 1,063 Arabic women in Qatar in a cross-sectional community-based survey. Face-to-face interviews are used to investigate knowledge, attitudes, practices, barriers and facilitators regarding breast cancer screening activities. Study 2: using an ethnographic qualitative methodology, this study will capture the complexity and diversity of the reasons for health behaviour choices in a purposive sample of 50 women, 50 men and 30 health care providers.
Results: Preliminary results from study 1 will be presented. These will include: participation rates of breast cancer screening activities of Arabic women in Qatar such as breast self-examination, clinical breast examination and mammography; levels of knowledge of breast cancer and its screening; identified barriers and facilitators to breast cancer screening as experienced by these women.
Conclusion: Combined results will enable development of culturally appropriate intervention strategies to raise awareness and participation rate in breast cancer screening among Arabic women living in Qatar and the Gulf region.
-
-
-
Factors Influencing Lifestyle Risk Behaviours Associated with Cardiovascular Diseases amongst Qatari Women
AbstractIn Qatar, cardiovascular diseases are the leading cause of mortality and morbidity.
Cardiovascular diseases can be prevented and controlled by modifying lifestyle risk behaviours such as physical inactivity, unhealthy diet and smoking. Obesity, as the result of physical inactivity and an unhealthy diet, raises the risk of heart disease. Studies show that 62.6% of Qatari women are overweight, and the prevalence of overweight is particularly high among adult females, reaching 80% of women aged 30 years and over. The Qatar World Health Survey in 2006 shows that only 40% of Qatari women participated regularly in sports or other physical activities. Furthermore, waterpipe smoking is increasing across the Eastern Mediterranean.
Funded by the Qatar National Research Fund, the ultimate goal of this study was to find ways to effectively promote cardiovascular/coronary artery disease prevention and management activities among Qatari women (citizen and resident Arabic women) by exploring the factors affecting Qatari women's physical activity, diet and smoking behaviours.
An exploratory qualitative research approach was used in this study, with a semi-structured questionnaire using open-ended questions to gather data. Individual in-depth interviews were conducted with 50 Arabic women, aged 30 years and over with a confirmed diagnosis of CVD/coronary artery disease, to investigate the factors influencing lifestyle risk behaviours associated with cardiovascular diseases amongst Qatari women (citizen and resident Arabic women).
The study's results show that social support networks; cultural beliefs, values, practices and religion; rapid economic growth; and changing environmental and social conditions influence women's participation in physical activities, dietary practices and smoking. Conclusion: Prevention of cardiovascular diseases and promotion of a healthy lifestyle should consider women's specific health conditions and socio-economic status; empower women to take charge of their health; facilitate women's informal and formal social support networks; provide culturally appropriate public education; and create a healthy environment with more recreational facilities for women and children.
-
-
-
The Association of Polymorphisms rs2055314, rs2272522 and rs331894 in Close Homologue of L1 gene (CHL1) with Schizophrenia in the State of Qatar
AbstractBackground: Previous reports demonstrated that polymorphisms in the CHL1 gene (close homologue of L1), located on chromosome 3p26, are associated with schizophrenia in different ethnic populations. The aim of this study is to investigate the association of the haplotypes of three genetic markers (SNPs) at the CHL1 locus, rs2055314 (C/T), rs2272522 (C/T) and rs331894 (A/G), with schizophrenia in the Qatari population.
Methods: A case-control association study was carried out on 48 Qatari schizophrenic patients [from the Psychiatry Hospital, Hamad Medical Corporation, Qatar] and 47 unrelated, healthy Qatari control subjects. Schizophrenia was diagnosed according to the Diagnostic and Statistical Manual of Mental Disorders—Fourth Edition (DSM-IV) criteria by two independent psychiatrists. Genotyping of SNPs rs2055314 (C/T), rs2272522 (C/T) and rs331894 (A/G) was carried out by the 5' nuclease assay using TaqMan MGB probes on an ABI 7500 instrument [Applied Biosystems].
Results: All SNPs are within Hardy-Weinberg Equilibrium (HWE). For rs2055314 (C/T), the genotype frequencies in controls versus schizophrenic patients were 35.30% vs. 31.25% for CC, 35.30% vs. 58.33% for CT, and 29.41% vs. 10.42% for TT (p = 0.034). The minor allele (T) frequency was 0.361 for all subjects, with an odds ratio of 0.84 (95% CI 0.37–1.91, p = 0.67) between cases and controls; under the genetic recessive model, the odds ratio was 4.00 (95% CI 0.96–16.69, p = 0.05). For rs331894 (G/A), the genotype frequencies in controls versus schizophrenic patients were 12.77% vs. 6.25% for GG, 40.42% vs. 50.00% for GA, and 47.65% vs. 43.75% for AA (p = 0.003). The minor allele (G) frequency was 0.407 for all subjects, with an odds ratio of 0.28 (95% CI 0.12–0.65, p = 0.002) between cases and controls; under the genetic recessive model, the odds ratio was 22.00 (95% CI 2.40–221.49, p = 0.0005).
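For readers unfamiliar with the recessive-model odds ratios quoted above, the short sketch below shows how such an odds ratio and its Wald 95% confidence interval are computed from a 2x2 table of risk-genotype carriers versus non-carriers in cases and controls; the counts used are hypothetical placeholders, not this study's data.

```python
import math

def odds_ratio_recessive(case_risk, case_other, ctrl_risk, ctrl_other):
    """Odds ratio and Wald 95% CI for carrying the risk genotype (e.g. TT)
    versus not carrying it (e.g. CC + CT), in cases versus controls."""
    or_ = (case_risk * ctrl_other) / (case_other * ctrl_risk)
    se_log_or = math.sqrt(1 / case_risk + 1 / case_other + 1 / ctrl_risk + 1 / ctrl_other)
    lower = math.exp(math.log(or_) - 1.96 * se_log_or)
    upper = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, (lower, upper)

# Hypothetical counts (NOT the study's data), for illustration only.
print(odds_ratio_recessive(case_risk=5, case_other=43, ctrl_risk=2, ctrl_other=45))
```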
Conclusion: Our findings therefore strengthen the association between the CHL1 gene markers; rs2055314 and rs331894 with schizophrenia and also support the view that cell adhesion molecules could be involved in the etiology of this disease among Qatari patients.
-
-
-
The Association of the Transcription Factor 7-like 2 [TCF7L2] Gene with Gestational Diabetes Mellitus in the State of Qatar
AbstractBackground: Genetic and environmental factors are both strongly associated with gestational diabetes mellitus (GDM) and type 2 diabetes (T2D). Our objective was to explore whether genetic variants of the TCF7L2 gene, such as rs12255372 and rs7903146, are significantly associated with the risk of gestational diabetes mellitus in an Arab population.
Methods: A case-control design was used for this genetic association study. A total of 159 unrelated pregnant women (114 Arab: 40 gestational diabetes mellitus cases and 74 controls; 45 non-Arab: 11 gestational diabetes mellitus cases and 34 controls) were recruited from the antenatal care unit of HMC. Blood samples were drawn for DNA extraction and genotyped for the TCF7L2 gene variants (rs12255372 and rs7903146) using a TaqMan real-time PCR assay. Plasma was used for biochemical analyses including glucose, insulin and adiponectin.
Results: The CC, CT and TT genotype frequencies of the TCF7L2 rs7903146 variant were not significantly different between controls and gestational diabetes mellitus cases among Arab subjects (39.4%, 50.0%, 10.6% vs. 40.6%, 43.8%, 15.6%, respectively; p = 0.444). Only the T allele of the rs12255372 variant was significantly associated with the risk of gestational diabetes mellitus among Arab subjects, with an odds ratio of 2.370 (95% CI 1.010–5.563, p = 0.047) under the genetic dominant model after adjustment for BMI and age. The other polymorphism, rs7903146, was not significantly associated with GDM among Arab or non-Arab subjects. No significant differences by genotype of either variant were observed for glucose, insulin or adiponectin after a 50 g glucose load.
Conclusion: The TCF7L2 rs12255372 variant is associated with an increased risk of gestational diabetes mellitus in Arab women. Further studies are needed with larger sample sizes.
-
-
-
Design of a Flexible Imaging Probe for Robotics Surgery
Authors: Carlos A Velasquez, Xianming Ye and W. Jong YoonAbstractMinimally invasive surgery assisted by robots has shown higher efficiency and precision. Despite the good performance of state-of-the-art surgical robotic systems, the size and number of external incisions required by the instruments should be reduced to lower the scarring and incisional pain experienced by the patient. Another important improvement derived from the use of smaller incisions is a shorter recovery time.
This research develops a flexible scope of small diameter, driven by segmented multiple actuators, to produce a deflection of the imaging probe at its distal end, providing better visualization than current rigid cameras. The research is conducted in cooperation with the Biorobotics Laboratory at the University of Washington, USA, where our scope will be incorporated into Raven, a 7-DOF cable-actuated surgical robot.
The flexible scope is composed of three segments, as illustrated in Figure 1: the device contains a ring of cables enclosed externally and internally by two different springs and a semi-rigid covering. The main bending mechanism is the compression and extension of the external spring at the distal end. The change of length in this spring is controlled by an active system that combines pulling and releasing actions through the ring of cables.
The device has advantages in its simplicity of actuation and the high elasticity and flexibility at the distal end. In the current state of the project, a testing system is under construction.
-
-
-
The Burden of Autism on Caregivers: A Snapshot From the State of Qatar
Authors: Muna Said Al-Ismail, Sara Ahmed, Nadir Kheir, Ola Ghoneim, Amy L Sandridge and Fadhila AlrawiAbstractBackground: Caring for a child diagnosed with autism is strongly linked to maternal caregiving burden. It forces family members to modify their daily lives to suit their different reality, and it imposes social, psychological and economic hardships. No previous research has assessed the burden associated with caring for a child with autism on the lives of caregivers in Qatar or the Gulf region.
Objective: To assess the burden of autism on the lives of caregivers of children with autism in Qatar.
Methods: Two groups of caregivers of children aged 3 to 17 years were recruited. The caregivers of children with autism (Autistic Group, or AG) were recruited from two developmental pediatric and children's rehabilitation clinics in Qatar. The caregivers of typically developing children (Non-Autistic Group, or NAG) were recruited during their visit to the family clinic of a primary health care facility for a routine medical check-up. Data collected from both groups included demographic information on caregivers and children and responses to several questions aimed at assessing the burden of caring for a child with autism. The questionnaire items were developed after a thorough literature review.
Results: Children in the AG spent more time indoors, watching television, or sleeping than children in the NAG (p=0.05). Around 50% of the caregivers in AG did not wish to answer questions about whether they would encourage their children to get married or become parents when they grow up. Half of the sample in the AG utilizes special education classes and other facilities, and the remaining half has access problems. Religious faith helps the majority of caregivers in coping with the burden associated with caring for a child with autism.
Conclusions: This study provided evidence for the impact of caring for a child with autism on the lives of caregivers. It also gave insight into the support provided to children with autism and their caregivers, and into the status of these children in different respects. The findings should help health policy-makers provide better and more focused support to children with autism and their families.
-
-
-
Loss Of Calreticulin Function Decreases NFKB Activity By Stabilizing IKB Protein
Authors: Nasrin Mesaeli, Kawthar Al-Dabhani, Shahrzad Jalali and Hamid MassaeliAbstractBackground: Transcription factor NFKB is activated by several processes including inflammation, endoplasmic-reticulum (ER) stress, increased Akt signaling and enhanced proteasomal degradation. Calreticulin is an ER Ca2+ binding chaperone, which regulates many cellular processes. Previously, we have shown that loss of calreticulin function results in the activation of ER stress that is accompanied by a significant increase in the proteasome activity. These changes increase the resistance of calreticulin deficient cells to apoptosis. A role for calreticulin has also been described in the regulation of immune response.
Objectives: To examine the role of calreticulin in the activation of NFKB signaling leading to enhanced resistance to apoptosis of these cells.
Methods: Wild type and calreticulin-deficient cells were used for measurement of the transcriptional activity of NFKB. Cells were co-transfected with NFKB reporter and β-gal reporter plasmids, followed by reporter gene assays. Western blot analysis was utilized to examine changes in protein expression.
Results: Our data illustrate a significant decrease in the basal transcriptional activity of NFKB upon loss of calreticulin function. Furthermore, treatment with lipopolysaccharide increased the transcriptional activity of NFKB in both the wild type and calreticulin deficient cells. However, the transcriptional activity of NFKB was still significantly lower in the calreticulin deficient cells as compared to the wild type cells. Our data also showed that the reduced NFKB activity in calreticulin deficient cells is not due to decreased p65 or p50 protein levels. To determine the mechanism of decreased NFKB activity we examined changes in IKB protein stability. Our data showed a significant increase in the IKB protein level due to a decreased level of phosphorylated IKB protein. Furthermore, we illustrated that loss of calreticulin function resulted in increased protein phosphatase 2A activity that was abolished by okadaic acid treatment. Inhibition of IKB de-phosphorylation decreased its ubiquitination and proteasomal degradation.
Conclusion: Our data suggests that the reduced transcriptional activity of NFKB upon loss of calreticulin function is mediated via stabilization of IKB protein. To our knowledge, this is the first report on the role of calreticulin in the regulation of NFKB function.
-
-
-
Protective Effects of Melatonin on Cisplatin Induced Growth Inhibition of MCF-7 Breast Cancer Cells
Authors: Vignesh Shanmugam and Dietrich BüsselbergAbstractBackground: The pineal hormone melatonin (MEL) is a versatile molecule with diverse physiological roles ranging from circadian entrainment to anti-cancer effects. Clinical trials indicated that a co-application of cisplatin and melatonin improved the 1-year survival rate. Also, Futugami (2001) reported that melatonin enhances the sensitivity of an ovarian cancer cell line to cisplatin.
Objective: Here we study the anti-cancer effects of a co-application of cisplatin (CDDP, 1 pM–10 mM, log10 scale) and melatonin (1 pM–100 μM) on MCF-7 cells.
Methods: Cell viability was assessed through MTT assays and trypan blue exclusion tests.
Results: 1) a) CDDP causes concentration-dependent growth inhibition of MCF-7 cells at high concentrations (1–100 μM) over 24 and 48 hrs, with an IC50 of 99.6 ± 5 μM (24 hrs). b) Over a period of 6 days, 1 μM CDDP causes significant growth inhibition (52.35 ± 0.64% of control). 2) MEL does not significantly inhibit MCF-7 cell growth at the 24 hr and 6 day time points. 3) Simultaneous co-application of MEL with CDDP significantly (p < 0.05) reverses 80 μM CDDP-induced growth inhibition over 24 hrs at physiological concentrations (0.1–10 nM) (increase in growth by 21.4 ± 1.8%). 4) However, simultaneous co-application of MEL and CDDP does not significantly reverse the growth inhibition induced by 1 μM CDDP over 6 days.
Conclusion: As reported by several labs, CDDP shows significant growth inhibition within 24 hrs only at high concentrations, while long-term growth inhibition is observed at low concentrations (1–10 μM). The results indicate that the sub-clone of MCF-7 cells used by us is melatonin “insensitive”, as MEL does not have an anti-proliferative effect at the time points tested. However, these cells are not completely unresponsive to MEL, as MEL reverses CDDP-induced growth inhibition at physiological concentrations. The question arises as to why such “protection” is observed only at physiological concentrations. Moreover, this effect is only observed when acute cell death is induced at high concentrations, and not at chronic low concentrations. To conclude, the results raise interesting questions about the molecular basis of the protective effect of melatonin on CDDP-induced cell death and of melatonin “insensitivity”.
-
-
-
Prevalence and Awareness of AIDS among the Qatari Community
Authors: Bothaina Saleh Elgahani and Asma Mohamed NetfaAbstractThis research was carried out to examine awareness of HIV/AIDS. Introduction: AIDS is caused by the HIV virus, which attacks the human immune system and can remain latent for up to 10 years without symptoms. The infection can be transmitted through blood, by injection or injury with infected tools, and through sexual relations with infected persons. Rationale: As HIV/AIDS awareness is lacking, we believe that educating society on this matter will help prevent the spread of the virus. Methods: Our research is based on a survey/questionnaire distributed to students, teachers and some friends who were selected randomly. In addition, we interviewed a clinical psychiatrist and focused on her AIDS patients while maintaining their anonymity. Results: Our research shows that 84% of the participants have a general idea about AIDS. Fortunately, 74% of them recognize that AIDS is a dangerous disease. When asked about the mechanism by which HIV works inside the body, 56% of the participants were able to answer the question correctly; and when asked about the ways of transmission, 47% said blood transfusion, 41% sexual relations, 46% needles, 45% transmission from mother to fetus, 42% from lactating mother to infant, and 40% dental surgical tools. Interview results showed that the HIV carriers might have psychological disturbances, judging by their behavior. Conclusion: Most of the study population have come across this disease and are aware of its level of seriousness. Unfortunately, they are not aware of further scientific details such as ways of transmission, symptoms or prevention.
-
-
-
Possible Effects of Sport Practice on the Respiratory Volumes and Possible Effects on Heart Rates and Blood Pressure in Adolescent Females
AbstractThis study was carried out to examine the difference between the respiratory volumes of athlete and non-athlete adolescent females, how exercise affects the respiratory volumes, and how the effects of exercise on the respiratory system could affect the heart rate. Introduction: The respiratory and circulatory systems are the most important systems in the body as they involve vital organs (lungs and heart). They are affected by several factors, which may improve or weaken their function. Exercise positively affects their function. Methods: In this study two groups, athletic and non-athletic females, were compared using a spirometer to measure the respiratory volumes before and after running. The heart rate and blood pressure were also measured in both groups. The statistical analysis and graphs were done using Excel 2007. Results: The pre-test mean vital capacity in non-athletes was 2600.0 ± 496.7 and in athletes 2642.9 ± 340.9, whereas the post-test mean vital capacity in non-athletes was 2385.7 ± 429.8 and in athletes 2428.6 ± 407.1. Pre-test mean blood pressure in non-athletes was 94.6 ± 9.3 mmHg and in athletes 86 ± 8.1 mmHg, but post-test mean blood pressure in non-athletes was 113.8 ± 31.6 mmHg and in athletes 99.1 ± 8.8 mmHg. The pre-test mean heart rate in non-athletes was 89 ± 5.2 beats per min, while in the athletes it was 86 ± 6.5 beats per min. Conclusion: In the short term, exercise increases the heart rate, decreases the blood pressure, decreases the lung capacity and increases tidal volume, whereas long-term effects involve increased lung capacity and tidal volume and decreased blood pressure and heart rate.
-
-
-
Management of Advanced Ectopic Pregnancy: Comparative Study between State of Qatar and Kingdom of Bahrain
AbstractBackground: Ectopic pregnancy is an increasing health risk for women that can cause maternal death in the first trimester. The incidence of ectopic pregnancy is 1–2% of pregnancies. The Fallopian tubes are the most common site of implantation (95.5%). Risk factors are higher in women with damaged fallopian tubes. Ultrasound (US) and β-hCG are the diagnostic tools. Management includes medical (methotrexate (MTX)) and surgical (laparotomy or laparoscopy) approaches.
Objectives: The objective of this study is to shed light on the management of advanced ectopic pregnancy, diagnosed according to β-hCG > 5000 or the presence of a fetal heartbeat on US, in relation to age, medical history, diagnosis and treatment in both Qatar and Bahrain.
Methods: This study was conducted at Hamad Medical Corporation (Qatar) and Bahrain Defense Force Hospital (Bahrain). After obtaining the required ethical approval, all cases of ectopic pregnancy between 2007 and 2011 were included. Data were collected from medical records on the approved data collection sheet and then statistically analyzed using SPSS 19 software. Analyses included descriptive statistics, cross-tabulations and chi-square tests, 95% confidence intervals and odds ratios.
Results: Out of 534 cases of ectopic pregnancy enrolled in this study, 127 (23.8%) were from Bahrain and 407 (76.2%) were from Qatar. The percentage of advanced cases was 15% in Bahrain and 41% in Qatar. In Bahrain, the treatments utilized were laparotomy and salpingectomy (84.2%), MTX alone (5.3%), and MTX followed by laparotomy and salpingectomy (10.5%), while in Qatar they were laparoscopy and salpingectomy (77.6%), MTX alone (19.4%), and MTX followed by laparoscopy and salpingectomy (3%). In both countries, the left tube was the most common side of implantation in advanced ectopic cases (52.2%), and a high incidence of tubal rupture (21.7%) was found among advanced ectopic cases.
Conclusions: Management of advanced ectopic pregnancy was mainly surgical, based on gestational sac size and patient's age. Laparoscopy in Qatar and laparotomy in Bahrain were the treatments of choice for advanced ectopic cases. Further investigation to compare maternal fertility after the different surgical approaches is recommended.
-
-
-
Implication of Protein-C in Thrombophilic State and Metastatic Dissemination
Authors: Hamda Ahmad Al-Thawadi, S Mirshahi, H Al Farsi, A Rafii, A Therwat, J Soria and M MirshahiAbstractThe coagulation/fibrinolytic system controls intravascular fibrin homeostasis, in addition to participating in a wide variety of physio-pathological processes. The components of the system influence tumor metastasis, growth and invasion as a result of their involvement in tumor matrix construction, angiogenesis and cell migration.
Thrombosis of unexplained etiology among healthy individuals and cancer patients is a major cause of death. Several homeostatic markers are currently used to predict the advent of thrombosis. However, none of these markers directly indicates the course and progression of the disease; thus thrombosis remains unexplained.
The Endothelial Protein C Receptor (EPCR) gene carries 13 single nucleotide polymorphisms, which define three haplotypes: A1, A2 and A3. One of these, A3, encodes a protein that is more sensitive than the other two to shedding enzymes. High levels of protein C are determined by PROCR haplotype 3, and the A3 haplotype reflects a high soluble Endothelial Protein C Receptor (sEPCR) level. Therefore, it is a candidate risk factor for venous thrombosis. We observed that the plasma concentration of sEPCR in cancer patients was much higher than that observed in controls. We suggest that sEPCR released from malignant cells could serve as a “trap” for protein C, preventing its binding to EPCR on the surface of endothelial cells and inducing a thrombotic state.
We developed a method, based on the activated partial thromboplastin time, to analyze the ability of EPCR on the cancer cell membrane to trap circulating activated protein C (APC). This test is used in conjunction with other specific tests for assessing the thrombotic state, such as quantitation of soluble fibrin and D-dimer.
A previous study of lung cancer patients, performed by the Department of Pathology, Free University Medical Center (2002), revealed a marked association between high EPCR levels and poor survival or relapse in patients with stage I lung adenocarcinoma. The aim of our study is to investigate the role of sEPCR as a cause of thrombotic disorders among cancer patients and, furthermore, to detect whether the EPCR of cancer cells is of haplotype 3, which affects the level of sEPCR.
-
-
-
Epithelial to Mesenchymal Transition in Ovarian Cancer Cell
Authors: Halema Al-Farsi, Raphael Lis, J Soria, H Al-Thawadi, A Therwat, A Rafii and M MirshahiAbstractEpithelial ovarian cancer is the most lethal gynecologic malignancy with the majority of cases being diagnosed after the disease has become metastatic according to the report by Obstetrics and Gynecology, Duke University Medical Center USA, 2008.
Consequently, genetic and epigenetic changes that disturb motility are likely to be important for the pathogenesis of ovarian cancer. Although ovarian cancer can be cured in up to 90% of cases while still confined to the ovary, approximately 70% are diagnosed after the occurrence of peritoneal dissemination, when the cure rate reduces to less than 30% according to recent studies by Global Cancer Statistic, CA Cancer 2011.
Recent reports have shown that 25% of the most cancerous cells within tumors have the features of cancer stem cells (CSCs). CSCs have been identified on the basis of their ability to self-renew, their capacity to differentiate into cancer cells, and their ability to form tumors in animal models.
We have already demonstrated that the majority of cells of the ovarian cancer cell line (OVCAR) express the CD133 and CD117 antigens. The CD133 antigen is a 120 kDa membrane glycoprotein first detected in CD34+ hematopoietic stem cells; it has therefore been widely used to identify and facilitate the isolation of hematopoietic stem and progenitor cells. CD117, or stem cell factor receptor (c-Kit), is also detected in hematopoietic stem and progenitor cells; this protein is a type 3 transmembrane receptor for MGF (mast cell growth factor).
CD133 and CD117 have been considered markers of CSCs. Also, the OVCAR CD133- cell subpopulation in in vitro culture can generate a subpopulation of OVCAR CD133+ cells, probably via epithelial to mesenchymal transition (EMT).
EMT describes a mechanism by which cells lose their epithelial characteristics and acquire more migratory mesenchymal properties. It also seems to have a key role in the acquisition of invasive and migratory properties in many types of carcinoma cells.
We aim to determine whether the transformation of these cancer cells into CSCs is dependent on the tumor type and on signaling pathways. We will use genomic and proteomic analysis of OVCAR CD133+/- and CD117+/- cells to target the EMT pathway.
-
-
-
Undifferentiated iPS Cells Do Not Regenerate Functional Lung Tissue When Seeded on Native Lung Extracellular Matrix under Biomimetic Culture Conditions
Authors: Heba Al-Siddiqi, Bernhard Jank, Roger Ng, Jeremy Song, Joseph Vacanti and Harald OttAbstractPerfusion-decellularized native lungs seeded with human BJ
RNA-induced pluripotent stem (BJ-RiPS) and umbilical vein endothelial cells failed to regenerate functional lung tissue as quantified by immunohistochemistry (no detection of TTF1,CC10,and Pro-SPB), gene expression (non-significant differences in lung-specific gene expression as compared to cells cultured under standard conditions), and in vitro lung gas exchange properties. Histological analysis of orthotopically transplanted BJ-RiPS lungs revealed a teratoma (detection of ectoderm: TuJ1, mesoderm: SMA, and endoderm: TTF1).
-
-
-
Genetics of Obesity
Authors: Mashael Nedham Al-Shafai, Phillippe Froguel and Mario FalchiAbstractObesity is a major health problem that has reached epidemic levels worldwide. Obesity is considered a highly heritable and genetically heterogeneous disorder. Despite the improvement in our understanding of the genetic basis of obesity, the underlying genetic cause of most families with extreme obesity is still unknown.
In this study, we aim to elucidate the missing heritability of obesity in bariatric surgery patients with a familial history of obesity. About 100 probands from France, the UK and Qatar will be screened for known obesity variants in MC4R (and LEP if they belong to a consanguineous family) by Sanger sequencing, and for two obesity-causing microdeletions at chr16p11.2 by MLPA. The families of ten of these probands not showing known monogenic obesity variants will be further analysed to seek new rare obesity-causing variants by whole exome sequencing and Illumina genotyping. We will examine the effect of the identified variants at the gene expression level by performing expression profiling analysis in blood and insulin-responsive tissues (muscle, liver, subcutaneous and visceral fat) from the probands, and in blood for the other family members. Moreover, we will investigate the effect of the variants on obesity surgery outcomes such as weight loss and reoperation rates.
Bariatric surgery offers a valuable opportunity to collect tissues from obese patients that can allow the integration of genetic information with gene expression to investigate the genetic basis of obesity. This research will provide novel insight into better health care protocols such as personalised medicine and genetic counselling for obesity, and could lead to the development of better treatment options for the future.
-
-
-
Genetics of Type 2 Diabetes among Qatari Families
Authors: Wadha Al-Muftah, Mario Falchi, Ramin Badii and Philippe FroguelAbstractThe prevalence of type 2 diabetes mellitus (T2D) is increasing rapidly worldwide, with figures projected to reach 700 million and 366 million by 2030, according to recent reports by the World Health Organization (2010) and the International Diabetes Federation (2010), respectively. T2D development has been shown to be driven by both environmental and genetic factors.
Consanguinity among Middle Eastern populations, especially in the Gulf region, has been shown to play a major role in predisposing to multiple hereditary conditions such as cancer, hypertension and T2D, the latter showing a moderately high prevalence (16.7%) among Qataris.
In this study we aim to identify novel genetic variants and clarify new molecular pathways of T2D in the Qatari population. We will take advantage of advanced genome-wide scanning and next-generation sequencing technologies to investigate a large three-generation Qatari family with a history of early-onset T2D in the initial stage of the study. More consanguineous families will be recruited for this project and will undergo the same investigational steps in order to identify novel mutations shared between the different family members.
To date, several approaches, such as candidate gene studies, linkage analysis, and genome-wide association studies (GWAS) have been used to identify genetic variants involved in the pathophysiology of T2D and glucose homeostasis. Among these, GWAS has been the most successful approach at the moment to uncovering common genetic variants involved in the disease susceptibility. Next generation whole exome sequencing is a new promising approach to gather novel insights into genes and pathways involved in T2D susceptibility, also allowing the discovery of potential rare mutations.
This study will use next generation sequencing technology to discover potential causative mutations segregating in diabetic inbred Qatari families, and possibly relevant to the Qatari population.
-
-
-
Genetic and Epigenetic Investigations of SNCA in Parkinson's Disease
Authors: Kholoud Nedham Alshafai, Alexandra I F Blakemore and Lefkos MiddletonAbstractParkinson's disease (PD) (OMIM168600) is the second most common age-related neurodegenerative disorder worldwide with a prevalence of more than 1% in people over 65 years old. The major hallmark of PD brain change is the formation of Lewy bodies, which are mainly composed of a protein called alpha-synuclein (encoded by the SNCA gene), aggregated together with other proteins.
Genetic variants of SNCA have been reported to be involved in both familial as well as sporadic cases of PD. Many of these variants result in the over-expression of the encoded protein making it prone to aggregation.
This report describes investigation of methylation of the two CpG islands in SNCA in brain samples from PD patients.
Fifty-three DNA samples were prepared from the cerebellum of PD brains, adding to 268 existing DNA samples. In the first part of the study, confirmation of suspected monogenic PD mutations was carried out using PCR and sequencing. However, no mutation was detected. Possible reasons for the discrepancy between predicted and observed results are discussed. In addition, 250 PD cases were screened for three monogenic mutations in SNCA using a commercial service, and none of these cases was found to carry the mutations.
In the second part of the study, DNA methylation of the two SNCA CpG islands was assessed in seven different brain regions of ten PD cases using bisulfite sequencing. No significant difference in DNA methylation was observed for either CpG 1 or CpG 2 across the studied brain regions.
Genetic and epigenetic studies on PD can help to provide better understanding of the mechanisms underlying the disease and its progression, enhancing our ability to discover and develop better treatment options for the future.
-
-
-
Non-invasive Physiological Monitoring for the Detection of Stressful Conditions
AbstractChronic stress is a leading risk factor for heart diseases, diabetes, asthma and depression. However, physicians find it difficult to continuously track a person's stress levels throughout the day, as current techniques of electrocardiogram and blood pressure monitoring are not practical. There is thus a critical need for a non-invasive, ambulatory device to track physiological stress over extended periods of time. Such information would allow physicians to assess precisely the effect of stress and determine the most appropriate interventions.
The primary objective of this study was to investigate the relationship between non-invasive, physiological signal parameters and the stress level as perceived by the subject. The mapping of physiological parameters onto stress levels to accurately monitor the stress levels in a subject under various conditions will assist the diagnosis of subjects at risk of various stress related disorders.
An ambulatory, wireless device was developed with respiratory rate, galvanic skin response and heart rate sensors, which the subjects can wear comfortably while performing their everyday tasks. An experiment involving 13 activities with different stress levels was conducted on 22 subjects, during which physiological data were collected using the developed device. While participating in the experiment, subjects had to record the stress level of each activity on a scale of 1 to 7. The collected data were processed in MATLAB, appropriate signal parameters were extracted and then correlated with the subjects' perceived stress levels.
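As a minimal sketch of the feature extraction and correlation step described above (assuming, purely for illustration, heart-rate samples at 1 Hz and one perceived-stress rating per 60-second window; this is not the study's MATLAB pipeline), the following Python snippet computes simple per-window features and correlates one of them with the ratings.

```python
# Minimal sketch (hypothetical data): windowed signal features vs. perceived stress.
import numpy as np
from scipy.stats import pearsonr

def window_features(signal, fs, win_s=60):
    """Split a 1-D signal into fixed windows and return mean and std per window."""
    win = int(fs * win_s)
    n = len(signal) // win
    chunks = signal[:n * win].reshape(n, win)
    return np.column_stack([chunks.mean(axis=1), chunks.std(axis=1)])

rng = np.random.default_rng(0)
heart_rate = 70 + rng.normal(0, 5, size=3600)     # hypothetical 1 Hz heart-rate trace
ratings = rng.integers(1, 8, size=60)             # hypothetical per-window ratings (1-7)

feats = window_features(heart_rate, fs=1)
r, p = pearsonr(feats[:, 0], ratings)             # mean heart rate vs. perceived stress
print(f"correlation r={r:.2f}, p={p:.3f}")
```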
Analysis of the data showed that the stress levels varied as the subjects progressed into different activities due to their varying current mental states. The perceived nature of stress varied considerably amongst the individuals with certain activities able to induce a stronger variation in stress compared to others. The derived features were shown to be effective in tracking the variation in stress induced in the subjects.
Results of the experiment showed that the developed device was effective in recording non-invasive physiological data for use in tracking the stress levels and mental state of the subjects. Further work is being done to develop an effective model that accurately predicts stress levels based on the physiological data collected.
-
-
-
An Arabic-Based Tutorial System for Children with Special Needs
Authors: Jihad Mohamad AL Jaam, Moutaz Saleh, Ali Jaoua and Abdulmotaleb ElsaddikAbstractIn spite of the current proliferation of computers in education in the Arab world, complete suites of solutions for students with special needs are very scarce. This paper presents an assistive system that manages learning content for children with moderate to mild intellectual disabilities. The system provides educational multimedia content, inspired by the local environment, in different subjects such as math, science, religion, daily life skills and others, to target specific learning goals suitable for this group of learners. The system tracks each student's progress against the individualized learning plan assigned by the specialized teacher and according to the learner's abilities. Upon completion of a particular task, the system tests the learner by asking them to order a set of sub-tasks into the logical sequence necessary to successfully accomplish the main task. The system also facilitates deploying intelligent tutoring algorithms that automatically correct mistakes after a number of trials, working adaptively with the learner until the task is successfully completed.
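As a minimal illustration of the sub-task ordering test described above (the task names and scoring rule below are hypothetical, not the system's actual logic), a learner's ordering can be checked against the correct sequence as follows.

```python
def check_ordering(learner_order, correct_order):
    """Return the fraction of sub-tasks placed at the correct position (hypothetical scoring rule)."""
    assert sorted(learner_order) == sorted(correct_order), "same sub-tasks expected"
    hits = sum(1 for a, b in zip(learner_order, correct_order) if a == b)
    return hits / len(correct_order)

# Hypothetical "wash hands" daily-life-skills task broken into sub-tasks.
correct = ["open tap", "wet hands", "apply soap", "rinse", "dry hands"]
attempt = ["open tap", "apply soap", "wet hands", "rinse", "dry hands"]
print(check_ordering(attempt, correct))   # 0.6 -> three of five steps in place
```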
-
-
-
Spider: A System for Finding Illegal 3D Video Copies
Authors: Mohamed Hefeeda and Naghmeh KhodabakhshiAbstractThree-dimensional (3D) videos are getting quite popular, and equipment for recording and processing them is becoming affordable. Creating 3D videos, however, is expensive; thus, protecting 3D videos against illegal copying is an important problem. We present a novel system for finding 3D video copies. Our system also identifies the location of the copied part in the reference video. The system can be used, for example, by video content owners, video hosting sites, and third-party companies to find illegally copied 3D videos. To the best of our knowledge, this is the first complete 3D video copy detection system in the literature.
Detecting 3D video copies is a challenging problem. First, comparing large numbers of frames from potential copies against reference videos is computationally intensive. Second, many modifications occur on copied videos; some of them are intentional to avoid detection and others are side effects of the copying process. For example, a copied video can be scaled, rotated, cropped, transcoded to a lower bit rate, or embedded into another video. The contrast, brightness, and colors of a video can also be manipulated. Furthermore, 3D videos come in various encoding formats, including stereo, multiview, video plus depth, and multiview plus depth. Changing the format is possible during copying, which complicates the detection process. Finally, new views can be synthesized from existing ones. These views display the scene from different angles, and thus reveal different information than in the original views. For example, an object occluded in one view could appear in another.
We implemented the proposed system and evaluated its performance using many 3D videos. We created a large set of query videos with 284 videos to represent all practical scenarios. Our results show that the proposed system achieves high precision and recall values in all scenarios. Specifically, our system results in 100% precision and recall when copied videos are unmodified parts of original videos, and it produces more than 90% precision and recall when copied videos are subjected to various transformations. Even in extreme cases where each video is subjected to five different transformations, our system yields more than 75% precision and recall.
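For reference, the precision and recall figures quoted above are computed from detection counts as in the short sketch below; the counts shown are hypothetical placeholders, not the paper's results.

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Hypothetical counts for one set of query videos.
print(precision_recall(true_positives=260, false_positives=10, false_negatives=14))
```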
-
-
-
Interference Identification for Next Generation Wireless Networks
Authors: Serhan Yarkan and Khalid A QaraqeAbstractWith the huge success of cellular mobile radio communications, demand for wireless services, applications, and technologies is expected to increase further. Such an increase forces recently emerging technologies (e.g., 4G) to coexist with the old ones (e.g., 2G–3G) in next generation wireless networks (NGWNs). In order for NGWNs to support all of these services and applications with the ever-increasing demand, wireless radio interference needs to be handled in an effective manner. Interference is a phenomenon which degrades the overall system capacity, affects the quality of service, and causes call drops and unnecessary handoffs in cellular mobile systems.
Interference is a very important concept from the perspective of military and of national security. Both unintentional and intentional interference, which is also known as jamming, should be cleared away as soon as possible for security reasons. Therefore, identification of interference in a reliable manner is a crucial task for all of the future wireless communications systems.
In this study, identification of radio interference in NGWNs is established. The proposed method takes into account the general characteristics of wireless propagation environments.
Since it is difficult to completely define a general wireless propagation environment, the statistical properties of widely used propagation environment classes such as urban and suburban are analyzed. Both first- and second-order statistical characteristics of wireless propagation environments are considered.
It is shown that the proposed method can identify the presence of interference under practical scenarios in a reliable manner.
In addition, it is demonstrated that the absence of a priori knowledge about the ambient noise power does not affect the performance of the proposed method.
Interference management is predicted to be an essential part of the system design for emerging NGWNs. Interference is also extremely important for military and national security applications and services. Therefore, identification of any form of interference in a reliable manner is of crucial importance. In this study, a method that can identify the presence of interference reliably under practical scenarios is proposed. The proposed method does not need any a priori information to identify the interference.
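The abstract does not spell out the detection rule itself, so the sketch below is only a generic illustration of flagging interference from the first- and second-order statistics of received samples (the signals, reference statistics and threshold are all hypothetical; the actual method reportedly works without a priori knowledge of the ambient noise power).

```python
# Generic statistics-based interference flag (illustrative only; not the paper's method).
import numpy as np

def looks_interfered(samples, ref_mean, ref_var, k=4.0):
    """Flag a block whose mean or variance deviates strongly from reference statistics."""
    m, v = samples.mean(), samples.var()
    mean_shift = abs(m - ref_mean) > k * np.sqrt(ref_var / len(samples))
    var_growth = v > k * ref_var
    return mean_shift or var_growth

rng = np.random.default_rng(1)
noise = rng.normal(0, 1, 10_000)                            # hypothetical interference-free block
interfered = noise + 3 * np.sin(0.05 * np.arange(10_000))   # plus a narrowband interferer

print(looks_interfered(noise, ref_mean=0.0, ref_var=1.0))        # False
print(looks_interfered(interfered, ref_mean=0.0, ref_var=1.0))   # True
```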
-
-
-
Towards Node Cooperation in Mobile Opportunistic Networks
Authors: Abderrahmen Mtibaa and Khaled HarrasAbstractMobile devices such as smart-phones and tablets are becoming ubiquitous, with ever increasing communication capabilities. In situations where the necessary infrastructure is unavailable, costly, or overloaded, opportunistically connecting these devices becomes a challenging area of research. Data is disseminated using nodes that store-carry-and-forward messages across the network. In such networks, node cooperation is fundamental to the message delivery process. Therefore, the lack of node cooperation (e.g., a node may refuse to act as a relay and settle for sending and receiving its own data) causes considerable degradation in the network. In order to ensure node cooperation in such networks, we investigate three main challenges: (i) ensuring fair resource utilization among participating mobile devices, (ii) enabling trustful communication between users, and (iii) guaranteeing scalable solutions for a large number of devices.
(i) Fairness is particularly important for mobile opportunistic networks since it acts as a major incentive for node cooperation. We propose and evaluate FOG - a real-time distributed framework that ensures efficiency-fairness trade-off for users participating in the opportunistic network.
(ii) Since users may not accept to forward messages in opportunistic networks without incentives, we introduce a set of trust-based filters to provide the user with an option of choosing trustworthy nodes in coordination with personal preferences, location priorities, contextual information, or encounter-based keys.
(iii) Mobile opportunistic solutions should scale to large networks. Our hypothesis is that in large-scale networks, mobile-to-mobile communication has its limitations. We therefore introduce CAF, a Community Aware Forwarding framework, which can easily be integrated with most state-of-the-art algorithms in order to improve their performance in large-scale networks. CAF uses social information to break down the network into sub-communities and forwards messages within and across sub-communities.
In the three contributions proposed above, we adopt a real-trace-driven approach to study, analyze, and validate our algorithms and frameworks. Our analysis is based on different mobility traces, including the San Francisco taxicab trace, traces collected at conferences such as Infocom’06 and CoNext’07, and the Dartmouth campus wireless data set.
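CAF's exact forwarding rules are not given in the abstract; the toy sketch below only illustrates the general flavor of a community-aware relay decision (hand a message to a relay in the destination's sub-community, or to a relay that encounters more nodes), with all node and community labels hypothetical.

```python
def should_forward(destination, relay, carrier, community_of, contact_count):
    """Toy community-aware relay rule (illustrative only, not CAF itself)."""
    dst_comm = community_of[destination]
    # Prefer relays already inside the destination's sub-community.
    if community_of[relay] == dst_comm and community_of[carrier] != dst_comm:
        return True
    # Otherwise, prefer relays that encounter more nodes than the current carrier.
    return contact_count[relay] > contact_count[carrier]

# Hypothetical network state: two sub-communities and per-node encounter counts.
community_of = {"A": 1, "B": 1, "C": 2, "D": 2}
contact_count = {"A": 5, "B": 2, "C": 9, "D": 4}
print(should_forward("D", relay="C", carrier="A",
                     community_of=community_of, contact_count=contact_count))  # True
```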
-
-
-
Massive Parallel Simulation of Motion of Nano-Particles at the Near-Wall Region in a Micro-Fluidics System
Authors: Othmane Bouhali, Reza Sadr and Ali SheharyarAbstractOne of the major challenges in Computational Fluid Dynamics (CFD) is the limitation of available computational speed, especially for N-body problems. Graphics processing units (GPUs) are considered an alternative to traditional CPU usage for some CFD applications, to fully utilize the computational power and parallelism of modern graphics hardware. In the present work, a Matlab simulation tool has been developed to study the flow at the wall-fluid interface at the nano scale in a micro-fluidic system. Micro-fluidics has become progressively more important in recent years in order to address the demand for increased efficiency in a wide range of applications in advanced systems such as the “lab-on-a-chip”. Lab-on-a-chip devices are miniature micro-fluidic labs that perform biological tests, such as proteomics, or chemical analysis/synthesis of highly exothermic reactions. For such cases, it takes a few hours to simulate a particular set of parameters even when run in parallel mode on the TAMUQ supercomputer. In order to accelerate the simulation, the Matlab program has been ported to a C/C++ program that exploits the GPU's massive parallel processing capabilities. Results from both methods will be shown and conclusions drawn.
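The abstract does not detail the particle-dynamics model, so the sketch below only illustrates the data-parallel pattern being exploited: updating all particle positions in one vectorized step (the kind of per-particle work that maps naturally onto GPU threads). The drift, diffusion coefficient and wall treatment are hypothetical placeholders.

```python
# Vectorized per-step update for many particles (illustrative, hypothetical physics).
import numpy as np

def step_particles(positions, drift, diff_coeff, dt, rng):
    """Advance all particles one time step: deterministic drift plus Brownian noise,
    with a simple reflecting wall at y = 0."""
    noise = rng.normal(0.0, np.sqrt(2.0 * diff_coeff * dt), size=positions.shape)
    positions = positions + drift * dt + noise
    positions[:, 1] = np.abs(positions[:, 1])   # reflect particles that cross the wall
    return positions

rng = np.random.default_rng(42)
pos = rng.uniform(0.0, 1.0, size=(100_000, 2))   # 100k particles, (x, y) positions
drift = np.array([0.1, 0.0])                     # hypothetical streamwise drift
for _ in range(10):
    pos = step_particles(pos, drift, diff_coeff=0.01, dt=1e-3, rng=rng)
print(pos.mean(axis=0))
```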
-
-
-
Statistical Mixture-Based Methods and Computational Tools for High-Throughput Data Derived in Proteomics and Metabolomics Study
Authors: Halima Bensmail, James Wicker and Lotfi ChouchaneAbstractQatar is accumulating substantial local expertise in biomedical data analytics. In particular, QCRI is forming a scientific computing multidisciplinary group with a particular interest in machine learning, statistical modeling and bioinformatics. We are now in a strong position to address the computational needs of biomedical researchers in Qatar, and to prepare a new generation of scientists with a multidisciplinary expertise.
The goal of genomics, proteomics and metabolomics is to identify and characterize the function of the genes, proteins and small molecules that participate in chemical reactions and are essential for maintaining life. This research area is expanding rapidly and holds great promise for the discovery of risk factors and potential biomarkers of diseases such as obesity and diabetes, two areas of increasing concern in the Qatari population.
In this paper, we develop new statistical modeling techniques for clustering based on mixture models with model selection, applied to large biomedical datasets (proteomics and metabolomics). Both deterministic and Bayesian approaches are used. The new approach is formulated within multivariate mixture-model cluster analysis to handle both normal (Gaussian) and non-normal (non-Gaussian) high-dimensional data.
To choose the number of mixture components, we develop model selection with the information measure of complexity (ICOMP) criterion based on the estimated inverse Fisher information matrix. We have promising preliminary results, which support the use of our algorithm to identify obesity susceptibility genes in humans in a genome-wide association study and in mass spectrometry data generated from adipocyte tissue for an obesity study.
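As a minimal sketch of mixture-based clustering with model selection over the number of components, the snippet below uses scikit-learn's GaussianMixture with BIC as a widely available stand-in for the ICOMP criterion developed in this work; the data are synthetic placeholders, not proteomics or metabolomics measurements.

```python
# Mixture-model clustering with model selection (BIC as a stand-in for ICOMP).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic two-cluster data standing in for a samples-by-features omics matrix.
data = np.vstack([rng.normal(0, 1, (200, 5)), rng.normal(4, 1, (150, 5))])

scores = {}
for k in range(1, 6):
    gmm = GaussianMixture(n_components=k, covariance_type="full", random_state=0).fit(data)
    scores[k] = gmm.bic(data)          # lower is better; ICOMP would be used analogously

best_k = min(scores, key=scores.get)
labels = GaussianMixture(n_components=best_k, random_state=0).fit_predict(data)
print(best_k, np.bincount(labels))
```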
-
-
-
Annotating a Multi-Topic Corpus for Arabic Natural Language Processing
Authors: Behrang Mohit, Nathan Schneider, Kemal Oflazer and Noah A. SmithAbstractHuman-annotated data is an important resource for most natural language processing (NLP) systems. Most linguistically annotated text for Arabic NLP is in the news domain, but systems that rely on this data do not generalize well to other domains. We describe ongoing efforts to compile a dataset of 28 Arabic Wikipedia articles spanning four topical domains—sports, history, technology, and science. Each article in the dataset is annotated with three types of linguistic structure: named entities, syntax and lexical semantics. We adapted traditional approaches to linguistic annotation in order to make them accessible to our annotators (undergraduate native speakers of Arabic) and to better represent the important characteristics of the chosen domains.
For the named entity (NE) annotation, we start with the task of marking boundaries of expressions in the traditional Person, Location and Organization classes. However, these categories do not fully capture the important entities discussed in domains like science, technology, and sports. Therefore, where our annotators feel that these three classes are inadequate for a particular article, they are asked to introduce new classes. Our data analysis indicates that both the designation of article-specific entity classes and the token-level annotation are accomplished with a high level of inter-annotator agreement.
Syntax is our most complex linguistic annotation, which includes morphology information, part-of-speech tags, syntactic governance and dependency roles of individual words. While following a standard annotation framework, we perform quality control by evaluating inter-annotator agreement as well as eliciting annotations for sentences that have been previously annotated so as to compare the results.
The lexical semantics annotation consists of supersense tags, coarse-grained representations of noun and verb meanings. The 30 noun classes include person, quantity, and artifact; the 15 verb tags include motion, emotion, and perception. These classes provide a middle-ground abstraction of the large semantic space of the language. We have developed a flexible web-based interface, which allows annotators to review preprocessed text and add the semantic tags.
Ultimately, these linguistic annotations will be publicly released, and we expect that they will facilitate NLP research and applications for an expanded variety of text domains.
-
-
-
Challenges and Techniques for Dialectal Arabic Speech Recognition and Machine Translation
Authors: Mohamed Elmahdy, Mark Hasegawa-Johnson, Eiman Mustafawi, Rehab Duwairi and Wolfgang MinkerAbstractIn this research, we propose novel techniques to improve automatic speech recognition (ASR) and statistical machine translation (SMT) for dialectal Arabic. Since dialectal Arabic speech resources are very sparse, we describe how existing Modern Standard Arabic (MSA) speech data can be applied to dialectal Arabic acoustic modeling. Our assumption is that MSA is always a second language for all Arabic speakers, and in most cases the original dialect of a speaker can be identified even when speaking MSA. Hence, an acoustic model trained with a sufficient number of MSA speakers will implicitly model the acoustic features of the different Arabic dialects. Since MSA and dialectal Arabic do not share the same phoneme set, we propose phoneme set normalization in order to use MSA cross-lingually in dialectal Arabic ASR. After normalization, we applied state-of-the-art acoustic model adaptation techniques to adapt MSA acoustic models with a small amount of dialectal speech. Results indicate a significant decrease in word error rate (WER). Since it is hard to phonetically transcribe large amounts of dialectal Arabic speech, we studied the use of graphemic acoustic models, where the phonetic transcription is approximated by word letters instead of phonemes. A large number of Gaussians in the Gaussian mixture model is used to model missing vowels. In the case of graphemic adaptation, a significant decrease in WER was also observed. The approaches were applied to Egyptian Arabic and Levantine Arabic. The reported experimental work was performed while the first author was at the German University in Cairo in collaboration with Ulm University. This work will be extended at Qatar University in collaboration with the University of Illinois to cover ASR and SMT for Qatari broadcast TV. We propose novel algorithms for learning the similarities and differences between Qatari Arabic (QA) and MSA, for purposes of automatic speech translation and speech-to-text machine translation, building on our own definitive research in the relative phonological, morphological, and syntactic systems of QA and MSA, and in the application of translation to interlingual semantic parsing. Furthermore, we propose a novel, efficient and accurate speech-to-text translation system, building on our research in landmark-based and segment-based ASR.
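A minimal sketch of the phoneme-set normalization step is shown below; the mapping is hypothetical and far smaller than a real phoneme inventory, and is meant only to illustrate how an MSA pronunciation lexicon could be rewritten onto a shared symbol set before acoustic model adaptation.

```python
# Illustrative sketch of phoneme-set normalization: MSA pronunciation entries are
# rewritten with a dialect-oriented symbol set before acoustic model adaptation.
# The mapping below is hypothetical and far smaller than a real phoneme inventory.
MSA_TO_NORMALIZED = {
    "th": "s",    # e.g., an MSA phone merged with a dialectal counterpart
    "dh": "z",
    "q":  "g",    # a well-known MSA/dialect correspondence, shown only as an example
}

def normalize_pronunciation(phones):
    """Map an MSA phone sequence onto the shared (normalized) phoneme set."""
    return [MSA_TO_NORMALIZED.get(p, p) for p in phones]

def normalize_lexicon(lexicon):
    """Apply the mapping to every entry of a word -> phone-list pronunciation lexicon."""
    return {word: normalize_pronunciation(phones) for word, phones in lexicon.items()}

if __name__ == "__main__":
    lexicon = {"qalb": ["q", "a", "l", "b"], "thalatha": ["th", "a", "l", "a", "th", "a"]}
    print(normalize_lexicon(lexicon))
```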
-
-
-
Pipeline Inspection using Catadioptric Omni-directional Vision Systems
Authors: Othmane Bouhali, Mansour Karkoub and Ali SheharyarAbstractOil and gas companies spend millions of dollars on the inspection of pipelines every year. The type of equipment used in carrying out the inspections is very sophisticated and requires very specialized manpower. In this article we present a novel approach to pipeline inspection using a small mobile robot and an omnidirectional vision system, which combines a reflective convex mirror and a perspective camera. Panoramic videos captured by the camera from the mirror are either stored or transmitted to a monitoring station to examine the inner surface of the pipeline. The videos are pre-processed, unwrapped, and unwarped to get a realistic image of the whole inner surface area of the pipeline. Several mirror shapes and unwarping techniques are used here to show the efficiency of this inspection technique. The image acquisition is instantaneous with a large field of view, which makes the system suitable for dynamic environments. The cost of such a system is relatively low compared to other available inspection systems.
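The unwrapping step can be pictured as a polar-to-Cartesian resampling around the mirror centre. The sketch below is a minimal nearest-neighbour version with hypothetical calibration parameters; a real pipeline would use the calibrated mirror profile rather than a simple linear radius mapping.

```python
import numpy as np

def unwrap_omni(image, center, r_inner, r_outer, out_width=720):
    """Unwrap a donut-shaped omnidirectional frame into a panoramic strip by
    sampling along rays from the mirror center (nearest-neighbour for brevity)."""
    cx, cy = center
    out_height = int(r_outer - r_inner)
    theta = np.linspace(0, 2 * np.pi, out_width, endpoint=False)
    radius = np.linspace(r_inner, r_outer, out_height)
    rr, tt = np.meshgrid(radius, theta, indexing="ij")
    xs = np.clip((cx + rr * np.cos(tt)).astype(int), 0, image.shape[1] - 1)
    ys = np.clip((cy + rr * np.sin(tt)).astype(int), 0, image.shape[0] - 1)
    return image[ys, xs]            # panoramic image: rows = radius, cols = angle

if __name__ == "__main__":
    frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)  # stand-in frame
    panorama = unwrap_omni(frame, center=(320, 240), r_inner=60, r_outer=220)
    print(panorama.shape)           # (160, 720)
```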
-
-
-
Development of a Telerobotic System to Assist the Physically Challenged Using Non-Contact Vision-Based Sensing and Command
Authors: Mansour Karkoub and M-G. HerAbstractIt is often a problem for a physically challenged person to perform a simple routine task such as eating, moving around, or picking up things on a shelf. Usually, these tasks require assistance from a capable person. However, this total or partial reliance on others for daily routines may be bothersome to the physically challenged and diminish their self-esteem. Moreover, getting around in a wheelchair, for example, requires the use of some form of a joystick, which is usually not very user-friendly. At Texas A&M Qatar, we developed two vision-based motion detection and actuation systems, which can be used to control the motion of a wheelchair without the use of a joystick and to remotely control the motion of a service robot for assistance with daily routines. The first vision system detects the orientation of the face whereas the second detects the motion of color tags placed on the person's body. The orientation of the face and the motion of the color tags are detected using a CCD camera and could be used to command the wheelchair and the remote robot wirelessly. The computation of the color tags' motion is achieved through image processing using eigenvectors and color system morphology. Through inverse dynamics and coordinate transformation, the motion of the operator's head, limbs, and face orientation could be computed and converted to the appropriate motor angles on the wheelchair and the service robot. Our initial results showed that it takes, on average, 65 milliseconds per calculation. The systems performed well even in complex environments, with errors that did not exceed 2 pixels and a response time of about 0.1 seconds. The results of the experiments are available at:
http://www.youtube.com/watch?v=5TC0jqlRe1U, http://www.youtube.com/watch?v=3sJvjXYgwVo, and http://www.youtube.com/watch?v=yFxLaVWE3f8.
It is our intent to implement the vision-based sensing system on an actual wheelchair and service robot and test it with a physically challenged person.
-
-
-
A Simulation Study of Underwater Acoustic Communications in the North Field of Qatar
Authors: Bahattin Karakaya, Mazen Omar Hasna, Murat Uysal, Tolga Duman and Ali GhrayebAbstractQatar is a leading natural gas producer and exporter in the world. Most of the natural gas (and oil) of Qatar is extracted from offshore wells and then transferred onshore for processing. In addition, Qatar is connected to the UAE by one of the world's longest underwater pipelines (managed by Dolphin Energy), which transfers processed gas from the offshore North Field to the UAE. Security of such critical offshore infrastructure against threats, along with environmental and preventive-maintenance monitoring (e.g., pollution, leakage), is of utmost importance. A wireless underwater sensor network can be deployed for the security and safety of underwater pipelines. However, underwater acoustic communication brings its own challenges, such as limited transmission range, low data rates and link unreliability.
In this paper, we propose “cooperative communication” as an enabling technology to meet the challenging demands of underwater acoustic communication (UWAC). Specifically, we consider a multi-carrier and multi-relay UWAC system and investigate relay (partner) selection rules in a cooperation scenario. For relay selection, we consider different selection criteria, which rely either on the maximization of signal-to-noise ratio (SNR) or the minimization of probability of error (PoE). These are used in conjunction with so-called per-subcarrier, all-subcarrier, or subcarrier-grouping approaches in which one or more relays are selected.
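As an illustration of the per-subcarrier selection rule (under the simplifying assumption that the end-to-end SNR is limited by the weaker hop), the following sketch picks, for every subcarrier, the relay maximizing that bottleneck SNR; the channel values are randomly generated stand-ins, not Bellhop outputs.

```python
import numpy as np

def select_relays_per_subcarrier(snr_sr, snr_rd):
    """Per-subcarrier relay selection: for each subcarrier pick the relay that
    maximizes the end-to-end SNR, approximated here by min(source->relay, relay->dest).

    snr_sr, snr_rd: arrays of shape (num_relays, num_subcarriers), linear SNR values.
    Returns (chosen relay index per subcarrier, resulting end-to-end SNR per subcarrier).
    """
    end_to_end = np.minimum(snr_sr, snr_rd)      # bottleneck-hop approximation
    best = end_to_end.argmax(axis=0)             # one relay per subcarrier
    return best, end_to_end[best, np.arange(end_to_end.shape[1])]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    num_relays, num_subcarriers = 4, 16
    # Exponential (Rayleigh-fading style) channel gains as stand-ins for modelled channels
    snr_sr = rng.exponential(10.0, (num_relays, num_subcarriers))
    snr_rd = rng.exponential(10.0, (num_relays, num_subcarriers))
    relays, snrs = select_relays_per_subcarrier(snr_sr, snr_rd)
    print("relay per subcarrier:", relays)
    print("mean end-to-end SNR:", snrs.mean())
```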
In our simulation study, we choose an offshore area in the North Eastern side of Qatar (which coincides with the North Field) and conduct an extensive Monte Carlo simulation study for the chosen location to demonstrate the performance of the proposed UWAC system. Our channel model builds on an aggregation of both large-scale path loss and small-scale fading. For acoustic path loss modeling, we use the ray-tracing algorithm Bellhop software to precisely reflect the characteristics of the simulation location such as the sound speed profile, sound frequency, bathymetry, type of bottom sediments, depths of nodes, etc (See Fig.1 ). Our simulation results for the error rate performance have demonstrated significant performance improvements over direct transmission schemes and highlighted the enhanced link reliability made possible by cooperative communications (See Fig.2 ).
-
-
-
Video Aggregation: Delivering Videos over Wireless Smart Camera Networks
Authors: Vinay Kolar, Vikram Munishwar, Nael Abu-Ghazaleh and Khaled HarrasAbstractThe proliferation of wireless technologies and inexpensive network-cameras has enabled low-cost and quick deployment of cameras for several surveillance applications, such as traffic monitoring and border control. Smart Camera Networks (SCNs) are networks of cameras that self configure and adapt to improve their operation and reduce the demand on human operators. However, SCNs are constrained by the ability of the underlying wireless network. Streaming video over a network requires substantial bandwidth, and strict Quality-of-Service (QoS) guarantees. In contrast, existing wireless networks have limited bandwidth, and the protocols do not guarantee QoS. Thus, for SCNs to scale beyond a small number of cameras, it is vital to design efficient video delivery protocols that are aware of the limitations of the underlying wireless network.
We propose to use Video Aggregation, a technique that enables efficient delivery of video in SCNs by combining related video streams. Existing SCNs use traditional routing protocols where intermediate network routers simply forward the video packets from cameras towards the video analysis center (or base-station). This is inefficient in SCNs since multiple cameras often cover overlapping regions of interest, and video information for these regions is redundantly transmitted over the network. The proposed video aggregation protocol eliminates redundant transmissions by dynamically pruning the overlapping areas at the intermediate routers. The routers blend the received streams into one panoramic video stream with no overlaps. Aggregation also dynamically controls the streaming rate to avoid network congestion and packet drops; the routers adjust the rate of the outgoing video by estimating the available network bandwidth. As a result, base-stations receive video frames with minimal packet drops, improving the quality of the received video.
Our testbed and simulation results show that aggregation outperforms traditional routing both in terms of received video quality and network bandwidth usage. Our testbed experiments show that aggregation improves the received video quality (the Peak Signal-to-Noise-Ratio metric) by 54%. In larger networks, we observed that aggregation eliminates up to 90% of the packet drops observed in SCNs with traditional routing. In the future, we plan to develop a suite of video delivery protocols, including SCN-aware scheduling and transport protocols.
-
-
-
Efficient Sequence Alignment Using MapReduce on the Cloud
AbstractOver the past few years, advances in the field of molecular biology and genomic technologies have led to an explosive growth of digital biological information. The analysis of this large amount of data is commonly based on the extensive and repeated use of conceptually parallel algorithms, most notably in the context of sequence alignment. Cloud computing provides scientists with a completely new model of utilizing the computing infrastructure. The cloud computing model is well suited to such bioinformatics applications, which require both the management of huge amounts of data and heavy computations.
The study aims at transforming a recently developed bioinformatics sequence alignment tool, named BFAST, to the cloud environment. The MapReduce version of the BFAST tool will be used to demonstrate the effectiveness of the MapReduce framework and the cloud-computing model in handling the intensive computations and management of the huge bioinformatics data.
A number of existing tools and technologies are utilized in this study to achieve an efficient transformation of the BFAST tool into the cloud environment. The implementation is mainly based on two core components: BFAST and MapReduce. BFAST is a software package for aligning next-generation genomic reads against a target genome with very high accuracy and reasonable speed. The MapReduce general-purpose parallelization technology (in its open-source implementation, Hadoop) appears to be particularly well adapted to the intensive computations and huge data storage tasks involved in the BFAST sequence alignment tool.
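The division of labour can be sketched in the Hadoop Streaming style below. This is not BFAST itself: the mapper's alignment call is replaced by a placeholder, and the reducer simply aggregates reads per reference position, only to illustrate the map/reduce structure.

```python
# Minimal Hadoop Streaming-style sketch (not BFAST): the mapper emits one
# (reference_position, read_id) record per input read, the reducer counts reads
# per position. A real deployment would call the aligner inside the mapper.
import sys

def mapper(lines, align=lambda read: hash(read) % 1000):
    """For each input read, emit 'position\tread_id'. The `align` placeholder
    stands in for an actual call to the alignment engine."""
    for i, line in enumerate(lines):
        read = line.strip()
        if read:
            print(f"{align(read)}\t{i}")

def reducer(lines):
    """Sum the number of reads mapped to each position (input sorted by key)."""
    current_key, count = None, 0
    for line in lines:
        key, _ = line.rstrip("\n").split("\t", 1)
        if key != current_key and current_key is not None:
            print(f"{current_key}\t{count}")
            count = 0
        current_key = key
        count += 1
    if current_key is not None:
        print(f"{current_key}\t{count}")

if __name__ == "__main__":
    # Invoked by Hadoop Streaming as e.g. `python this_script.py map` / `... reduce`
    (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)
```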
The MapReduce version of the BFAST tool is expected to offer better results than the original one in terms of maintaining good computational efficiency, accuracy, scalability, deployment and management efforts.
The study demonstrates how a general-purpose parallelization technology, i.e. MapReduce running on the cloud, can be tailored to tackle this class of bioinformatics problems with good performance and scalability, and, more importantly, how this technology could be the basis of a computational parallel platform for several problems in the context of bioinformatics. Although the effort of transforming existing bioinformatics algorithms from local compute infrastructure to the cloud is not trivial, the speed and flexibility of cloud computing environments provide a substantial boost at manageable cost.
-
-
-
Realistic Face and Lip Expressions for a Bilingual Humanoid Robot
Authors: Amna Alzeyara, Majd Sakr, Imran Fanaswala and Nawal BehihAbstractHala is a bilingual (Arabic and English) robot receptionist located at Carnegie Mellon in Qatar. Hala is presented to users as a 3-D animated face on a screen. Users type to her, and she replies in speech synced with realistic facial expressions. However, there are two existing problems with the robot. First, Hala's animation engine does not fully adhere to existing research on face dynamics, which makes it difficult to create natural and interesting facial expressions. Natural expressions help towards an engaging user experience by articulating non-verbal aspects (e.g., confusion, glee, horror). Second, while speaking Arabic, lip movements are not realistic because they were adopted from English utterances. In this work we address these two limitations.
Similar to the movie and video-game industry, we leverage Paul Ekman's seminal work on the Facial Action Coding System (FACS) to demarcate Hala's 3D face model into muscle primitives. These primitives are used to compose complex, yet natural, facial expressions. We have also authored an in-house tool, which allows non-programmers (e.g., artists) to manipulate the face in real time to create expressions.
The sounds humans make while talking are symbolically captured as “phonemes”. The corresponding shapes of the lips for these sounds (i.e. phonemes) are called “visemes”. We used existing research and observed each other (and a mirror) to develop visemes that accurately capture Arabic pronunciations. Hala can thus utilize English and Arabic visemes for accurate lip movement and syncing. We empirically tested and evaluated our work by comparing it with previous lip movements for common Arabic utterances. Nonetheless, certain pronunciations can fire less-than-ideal visemes if they are preceded by silence.
Upon identifying and addressing these limitations, Hala has 11 new facial expressions, making for a more natural-looking and more naturally behaving robot. This work also pioneered the first implemented subset of Arabic visemes on a robot.
-
-
-
Interference-Aware Spectrum Sharing Techniques for Next Generation Wireless Networks
Authors: Marwa Khalid Qaraqe, Ziad Bouida, Mohamed Abdallah and Mohamed-Slim AlouiniAbstractBackground: Reliable high-speed data communication that supports multimedia application for both indoor and outdoor mobile users is a fundamental requirement for next generation wireless networks and requires a dense deployment of physically coexisting network architectures. Due to the limited spectrum availability, a novel interference-aware spectrum-sharing concept is introduced where networks that suffer from congested spectrums (secondary-networks) are allowed to share the spectrum with other networks with available spectrum (primary-networks) under the condition that limited interference occurs to primary networks.
Objective: Multiple-antenna transmission and adaptive rate can be utilized as power-efficient techniques for improving the data rate of the secondary link while satisfying the interference constraint of the primary link, by allowing the secondary user to adapt its transmitting antenna, power, and rate according to the channel state information.
Methods: Two adaptive schemes are proposed using multiple-antenna transmit diversity and adaptive modulation in order to increase the spectral efficiency of the secondary link while maintaining minimum interference with the primary. Both the switching efficient scheme (SES) and the bandwidth efficient scheme (BES) use the scan-and-wait combining (SWC) antenna technique, in which a secondary transmission occurs only when a branch with acceptable performance is found; otherwise the data is buffered.
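A schematic sketch of the scan-and-wait decision (with hypothetical SNR thresholds and interference limit) is given below; it is meant only to illustrate the branch-scanning logic, not the analytical model used in this work.

```python
import numpy as np

def scan_and_wait(branch_snr, interference_to_primary, snr_thresholds,
                  interference_limit):
    """Scan transmit branches in order and return the first branch whose SNR
    supports some constellation while respecting the primary's interference
    limit; return None (buffer the data) if no branch qualifies.

    snr_thresholds: {constellation_size: minimum SNR meeting the target BER}.
    """
    for branch, (snr, interference) in enumerate(zip(branch_snr,
                                                     interference_to_primary)):
        if interference > interference_limit:
            continue                                   # would harm the primary link
        feasible = [m for m, thr in snr_thresholds.items() if snr >= thr]
        if feasible:
            return branch, max(feasible)               # largest supportable constellation
    return None                                        # wait: buffer the data

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    thresholds = {4: 8.0, 16: 14.0, 64: 20.0}          # illustrative SNR cut-offs
    decision = scan_and_wait(rng.uniform(5, 25, 4), rng.uniform(0, 2, 4),
                             thresholds, interference_limit=1.0)
    print(decision)
```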
Results: In both schemes the constellation size and the selected transmit branch are determined to minimize the average number of switches and achieve the highest spectral efficiency given a minimum bit-error-rate (BER), fading conditions, and peak interference constraint. For delay-sensitive applications, two schemes using power control are used: SES-PC and BES-PC. In these schemes the secondary transmitter sends data using a nominal power level, which is optimized to minimize the average delay. Several numerical examples show that the BES scheme increases the capacity of the secondary link.
Conclusion: The SES and BES schemes reach high spectral efficiency and BER performance at the expense of an increased delay. The SES-PC and BES-PC schemes minimize the average delay, satisfy the BER, and maintain a high spectral efficiency. The proposed power optimization and power control processes minimize the delay and the dropping probability, especially if we extend the presented work to a multiuser scenario.
-
-
-
Conceptual Weighted Feature Extraction and Support Vector Model: A Good Combination for Text Categorization
Authors: Ali Mohamed Jaoua, Sheikha Ali Karam, Samir Elloumi and Fethi FerjaniAbstractWhile weighted features are known to increase recall during the document selection step in information retrieval (IR) systems, conceptual methods have helped to find good features. Starting from the features of a sample of Arabic news belonging to k different financial categories, and using the support vector model (SVM), k(k-1) classifiers are generated using one-against-one classification. A new document is submitted to the k(k-1) different classifiers and then, using a voting heuristic, is assigned to the most frequently selected category. Categorization results obtained with two different feature-extraction methods, one based on optimal concepts and the other based on isolated labels, showed that isolated labels generate better features because of the specificity of the selected features. Therefore, we can say that the quality of the features, combined with weighting methods and SVM, is an important factor for more accurate classification. The proposed method based on isolated labels gives a good classification rate, greater than 80%, for Arabic news in the financial domain across five categories. Generalized to English texts and to more categories, it becomes a good preprocessing filter preceding the automatic annotation step, and therefore helps with more accurate event structuring. The attached figure shows the different steps of the new categorization method.
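A minimal sketch of such a classification pipeline is shown below, using scikit-learn with TF-IDF weighting as a stand-in for the weighting scheme described above; SVC internally trains one classifier per pair of categories and predicts by majority vote, mirroring the one-against-one voting. The toy corpus and category names are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Toy stand-in corpus: a few "documents" per financial category.
docs = ["stocks rallied on strong earnings", "the central bank raised rates",
        "oil prices fell sharply", "the company reported quarterly earnings",
        "bond yields rose after the rate decision", "crude output cuts lifted oil"]
labels = ["equities", "monetary", "energy", "equities", "monetary", "energy"]

# TF-IDF supplies the weighted features; SVC trains one classifier per pair of
# categories and assigns a new document by majority vote over those classifiers.
model = make_pipeline(TfidfVectorizer(),
                      SVC(kernel="linear", decision_function_shape="ovo"))
model.fit(docs, labels)

print(model.predict(["earnings beat expectations for the quarter"]))
```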
-
-
-
Record Linkage and Fusion over Web Databases
Authors: Mourad Ouzzani, Eduard Dragut, El Kindi and Amgad MadkourAbstractMany data-intensive applications on the Web require integrating data from multiple sources (Web databases) at query time. Online sources may refer to the same real world entity in different ways and some may provide outdated or erroneous data. An important task is to recognize and merge the various references that refer to the same entity at query time. Almost all existing duplicate detection and fusion techniques work in the offline setting and, thus, do not meet the online constraint. There are at least two aspects that differentiate online duplicate detection and fusion from its offline counterpart. First, the latter assumes that the entire data is available, while the former cannot make such a hard assumption. Second, several iterations (query submissions) may be required to compute the “ideal” representation of an entity in the online setting.
We propose a general framework to address this problem: an interactive caching solution. A set of frequently requested records is cleaned off-line and cached for future references. Newly arriving records in response to a stream of queries are cleaned jointly with the records in the cache, presented to users and appended to the cache.
We introduce two online record linkage and fusion approaches: (i) a record-based and (ii) a graph-based. They chiefly differ in the way they organize data in the cache as well as computationally. We conduct a comprehensive empirical study of the two techniques with real data from the Web. We couple their analysis with commonly used cache settings: static/dynamic, cache size and eviction policies.
-
-
-
Call Admission Control with Resource Reservation for OFDM Networks
Authors: Mehdi Khabazian, Osama Kubbar and Hossam HassaneinAbstractThe scarcity of radio resources and variable channel quality pose many challenges for resource management in future all-IP wireless communications. One technique to guarantee a certain level of quality of service (QoS) is call admission control (CAC). Briefly, CAC is a mechanism which decides whether a new call should be admitted or rejected depending on its impact on the QoS of current calls. Conventional CACs, such as guard channel, channel borrowing and queuing priority techniques, only consider instantaneous radio resource availability when making admission decisions; thus they are neither able to prevent network congestion nor to meet the QoS requirements of different users with multi-service requirements.
In this work, we propose a new CAC technique that looks ahead at the extra resources that may be needed, through a reservation technique, to offset changes in channel condition due to mobility. We show that during a call session the needed radio resources may increase compared with the resources negotiated during call setup. Although such fluctuations are fairly small for a single call, they are not negligible when the network is congested. As a result, some ongoing calls may experience QoS degradation. We show that such a consideration is critical in orthogonal frequency division multiplexing (OFDM) wireless networks such as 3GPP LTE, where the radio resources are assigned to users depending on channel quality. The study assumes two types of applications, denoted wide-band and narrow-band, and the performance of the proposed algorithm is modeled through queuing theory and event-driven simulation approaches. The results show that such a reservation technique improves call admission performance significantly in terms of call blocking, call dropping and call QoS degradation probabilities, and that it outperforms conventional CACs with insignificant loss in network capacity.
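A schematic version of such an admission rule, keeping a reservation margin for ongoing calls whose demand may grow as channel conditions degrade, might look like the following; the resource units and margin are illustrative only and do not reflect the actual queuing model.

```python
def admit_call(requested_prbs, allocated_prbs, total_prbs,
               reservation_fraction=0.1):
    """Admit a new call only if its requested resources fit after setting aside a
    reservation margin for ongoing calls whose channel quality may degrade.

    requested_prbs: resource blocks requested by the new call.
    allocated_prbs: resource blocks currently held by ongoing calls.
    reservation_fraction: share of the ongoing calls' allocation kept as headroom.
    """
    reserve = reservation_fraction * allocated_prbs
    return allocated_prbs + reserve + requested_prbs <= total_prbs

if __name__ == "__main__":
    # Wide-band call asking for 12 PRBs on a cell with 100 PRBs, 70 already in use.
    print(admit_call(requested_prbs=12, allocated_prbs=70, total_prbs=100))  # True
    print(admit_call(requested_prbs=25, allocated_prbs=70, total_prbs=100))  # False
```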
-
-
-
A Distributed Reconfigurable Active Solid State Drive Platform for Data Intensive Applications
Authors: Mazen Saghir, Hassan Artail, Haitham Akkary, Hazem Hajj and Mariette AwadAbstractThe ability to efficiently extract useful information from volumes of data distributed across myriad networks is hindered by the latencies inherent to magnetic storage devices and computer networks. We propose overcoming these limitations by leveraging solid-state drive (SSD) and field-programmable gate array (FPGA) technologies to process large streams of data directly at the storage sites.
Our proposed reconfigurable, active, solid-state drive (RASSD) platform consists of distributed nodes that couple SSDs with FPGAs. While SSDs store data, FPGAs implement processing elements that couple soft-core RISC processors with dynamically reconfigurable logic resources. The processors execute data-processing software drivelets, and the logic resources implement hardware for accelerating performance-critical operations. Executing appropriate drivelets and using matching hardware accelerators enables us to efficiently process streams of data stored across SSDs.
To manage the configuration of RASSD nodes and provide a transparent interface to applications, our platform also consists of distributed middleware software. Client local middleware (CLM) resides on client host machines to interpret application data processing requests, locate storage sites, and exchange data-processing requests and results with middleware servers (MWS). MWS connect to clusters of RASSD nodes and contain libraries of drivelets and accelerator configuration bit streams. An MWS loads appropriate drivelet and accelerator bit streams onto a RASSD node's FPGA, aggregates processed data, and returns it to a CLM.
To evaluate our platform, we implemented a simple system consisting of a host computer connected to a RASSD node over a peer-to-peer network. We ran a keyword search application on the host computer, which also provided middleware functionality. We then evaluated this platform under three configurations. In configuration C1, the RASSD node was only used to store data while all data was processed by the MWS running on the host computer. In configuration C2, the data was processed by a drivelet running on the RASSD node. Finally, in configuration C3, the data was processed both by a drivelet and a hardware accelerator.
Our experimental results show that C3 is 2x faster than C2, and 6x faster than C1. This demonstrates our platform's potential for enhancing the performance of data-intensive applications over current systems.
-
-
-
A Dynamic Physical Rate Adaptation for Multimedia Quality-Based Communications in IEEE_802.11 Wireless Networks
By Mariam FlissAbstractBecause IEEE 802.11 Wireless Local Area Networks (WLANs) are based on a radio/infrared link, they are more sensitive to channel variations and connection interruptions. Therefore, supporting multimedia applications over WLANs becomes inconvenient when link rate and transmission delay requirements cannot be met. We studied link adaptation facets and the Quality of Service (QoS) requirements essential for successful multimedia transmissions. In fact, the efficiency of rate control schemes is linked to how quickly they respond to channel variation. The 802.11 physical layers provide multiple transmission rates (different modulation and coding schemes). The latest 802.11g version supports 12 physical rates of up to 54 Mbps in the 2.4 GHz band. As a result, Mobile Stations (MSs) are able to select the appropriate link rate depending on the required QoS and instantaneous channel conditions to enhance the overall system performance. Hence, the link adaptation algorithm is a vital part of achieving the highest transmission capability in WLANs. When to decrease and when to increase the transmission rate are the two fundamental questions faced when designing a new physical-rate control mechanism. Many research works focus on tuning channel estimation schemes to better detect when the channel condition has improved enough to accommodate a higher rate, and then adapt the transmission rate accordingly. However, those techniques usually entail modifications to the current 802.11 standard. Another way to perform link control is based on local acknowledgment (Ack) information at the transmitter station. Consequently, two such techniques were accepted by the standard due to their efficiency and implementation simplicity.
We propose a new dynamic time-based link adaptation mechanism, called MAARF (Modified Adaptive Auto Rate Fallback). Besides the frame transmission results, the new model implements a round-trip time (RTT) technique to adequately select an instantaneous link rate. The proposed model is evaluated against the most recent techniques adopted by the IEEE 802.11 standard: the ARF (Auto Rate Fallback) and AARF (Adaptive ARF) schemes. Simulation results will be given to illustrate the link quality improvement of multimedia transmissions over Wi-Fi networks and to compare its performance with previously published results.
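For reference, the baseline ARF behaviour that MAARF builds on can be sketched as below; the RTT-based refinement is only indicated in a comment, since the exact rule is not spelled out here, and the step counts are the commonly cited defaults.

```python
RATES_MBPS = [1, 2, 5.5, 6, 9, 11, 12, 18, 24, 36, 48, 54]  # 802.11b/g rate set

class ARFRateController:
    """Baseline ARF-style controller: step the rate up after a run of consecutive
    successful transmissions, step it down after consecutive failures. MAARF would
    additionally consult a round-trip-time estimate before stepping up (not shown)."""

    def __init__(self, up_after=10, down_after=2):
        self.index = 0                      # start at the most robust rate
        self.successes = 0
        self.failures = 0
        self.up_after = up_after
        self.down_after = down_after

    @property
    def rate(self):
        return RATES_MBPS[self.index]

    def report(self, acked):
        if acked:
            self.successes += 1
            self.failures = 0
            if self.successes >= self.up_after and self.index < len(RATES_MBPS) - 1:
                self.index += 1
                self.successes = 0
        else:
            self.failures += 1
            self.successes = 0
            if self.failures >= self.down_after and self.index > 0:
                self.index -= 1
                self.failures = 0

if __name__ == "__main__":
    ctrl = ARFRateController()
    for acked in [True] * 25 + [False, False] + [True] * 10:
        ctrl.report(acked)
    print("current rate:", ctrl.rate, "Mbps")
```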
-
-
-
Interference Cancellation of Hop-By-Hop Beamforming for Dual-Hop MIMO Relay Networks
Authors: Fawaz AL-Qahtani and Hussein AlnuweiriAbstractCooperative communication relaying systems are gaining much interest because they can improve the average link signal-to-noise ratio by replacing longer hops with multiple shorter hops. The method of relaying has been introduced to enable a source (e.g., a mobile terminal) to communicate with a target destination via a relay (e.g., another mobile terminal). Furthermore, multiple-input multiple-output (MIMO) communication systems have been considered powerful candidates for the fourth generation of wireless communication standards because they can achieve further performance improvements, including an increase in the achievable spectral efficiency and peak data rates (multiplexing) and robustness against severe effects of fading (transmit beamforming). In this work, we consider a hop-by-hop beamforming relaying system over Rayleigh fading channels. In wireless communication environments, it is well understood that the performance of wireless networks can be limited by both fading and co-channel interference (CCI). The multiple antennas at each node of the relaying system can be used to adaptively modify the radiation pattern of the array to reduce interference by placing nulls in the direction of the dominant interferers. In this paper, we investigate the effect of CCI on the performance of the hop-by-hop beamforming relaying system. First, we derive exact closed-form expressions for the outage probability and average symbol error rates. Moreover, we look into the high signal-to-noise ratio (SNR) regime and study the diversity order and coding gain achieved by the system.
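A Monte Carlo cross-check of the kind of outage analysis described above can be sketched as follows, under simplifying assumptions: decode-and-forward relaying, per-hop post-beamforming SNR modelled as a sum of exponential (Rayleigh) channel gains, and co-channel interference omitted for brevity. This is not the closed-form derivation, only a numerical illustration.

```python
import numpy as np

def outage_probability(avg_snr_db=10.0, n_antennas=2, threshold_db=5.0,
                       trials=200_000, seed=0):
    """Monte Carlo estimate of the outage probability of a dual-hop decode-and-forward
    link over Rayleigh fading, with beamforming gain modelled as the sum of
    per-antenna channel gains (co-channel interference omitted for brevity)."""
    rng = np.random.default_rng(seed)
    avg_snr = 10 ** (avg_snr_db / 10)
    threshold = 10 ** (threshold_db / 10)
    # Per-hop post-beamforming SNR: average SNR times a sum of exponential gains.
    snr_hop1 = avg_snr * rng.exponential(1.0, (trials, n_antennas)).sum(axis=1)
    snr_hop2 = avg_snr * rng.exponential(1.0, (trials, n_antennas)).sum(axis=1)
    end_to_end = np.minimum(snr_hop1, snr_hop2)   # bottleneck hop dominates
    return float((end_to_end < threshold).mean())

if __name__ == "__main__":
    for snr_db in (5, 10, 15, 20):
        print(snr_db, "dB ->", outage_probability(avg_snr_db=snr_db))
```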
-
-
-
Novel Reduced-Feedback Wireless Communication Systems
Authors: Mohammad Obaidah Shaqfeh, Hussein Alnuweiri and Mohamed-Slim AlouiniAbstractModern communication systems apply channel-aware adaptive transmission techniques and dynamic resource allocation in order to exploit the peak conditions of the fading wireless links and to enable significant performance gains. However, conveying the channel state information from the users' mobile terminals to the access points of the network consumes a significant portion of the scarce air-link resources and depletes the battery resources of the mobile terminals rapidly. Despite its evident drawbacks, channel information feedback cannot be eliminated in modern wireless networks because blind communication technologies cannot support the ever-increasing transmission rates and high quality-of-experience demands of current ubiquitous services.
Developing new transmission technologies with reduced feedback requirements is therefore sought. Network operators will benefit from releasing the bandwidth resources reserved for feedback communications, and clients will enjoy the extended battery life of their mobile devices. The main technical challenge is to preserve the prospective transmission rates over the network despite decreasing the channel information feedback significantly. This is a noteworthy research theme, especially since there is no mature theory for feedback communication in the existing literature despite the growing number of publications on the topic in the last few years. More research efforts are needed to characterize the trade-off between the achievable rate and the required channel information, and to design new reduced-feedback schemes that can be flexibly controlled based on operator preferences. Such schemes can then be introduced into the standardization bodies for consideration in next-generation broadband systems.
We have recently contributed to this field and published several journal and conference papers. We pioneered a novel reduced-feedback opportunistic scheduling scheme that combines many desired features, including fairness in resource distribution across the active terminals and distributed processing at the MAC layer. In addition, our scheme operates close to the upper capacity limits of achievable transmission rates over wireless links. We have also proposed another hybrid scheme that enables adjusting the feedback load flexibly based on rate requirements. We are currently investigating other novel ideas to design reduced-feedback communication systems.
-
-
-
Learning to Recognize Speech from a Small Number of Labeled Examples
AbstractMachine learning methods can be used to train automatic speech recognizers (ASR). When porting ASR to a new language, however, or to a new dialect of spoken Arabic, we often have too few labeled training data to allow learning of a high-precision ASR. It seems reasonable to think that unlabeled data, e.g., untranscribed television broadcasts, should be useful to train the ASR; human infants, for example, are able to learn the distinction between phonologically similar words after just one labeled training utterance. Unlabeled data tell us the marginal distribution of speech sounds, p(x), but do not tell us the association between labels and sounds, p(y|x). We propose that knowing the marginal is sufficient to rank-order all possible phoneme classification functions, before the learner has heard any labeled training examples at all. Knowing the marginal, the learner is able to compute the expected complexity (e.g., derivative of the expected log covering number) of every possible classifier function, and based on measures of complexity, it is possible to compute the expected mean-squared probable difference between training-corpus error and test-corpus error. Upon presentation of the first few labeled training examples, then, the learner simply chooses, from the rank-ordered list of possible phoneme classifiers, the first one that is reasonably compatible with the labeled examples. This talk will present formal proofs, experimental tests using stripped-down toy problems, and experimental results from English-language ASR; future work will test larger-scale implementations for ASR in the spoken dialects of Arabic.
-
-
-
Demonstration Prototype of a High-Fidelity Robotic-Assisted Suturing Simulator
Authors: Georges Younes, George Turkiyyah and Jullien Abi NahedAbstractRapid advances in robotic surgical devices have put significant pressure on physicians to learn new procedures using newer and sophisticated instruments. This in turn has increased the demand for effective and practical training methods using these technologies, and has motivated the development of surgical simulators that promise to provide practical, safe, and cost-effective environments for practicing demanding robotic-assisted procedures.
However, despite the significant interest and effort in the development of such simulators, the current state-of-the-art surgical simulators are lacking. They introduce significant simplifications to obtain real-time performance, and these simplifications often come at the expense of realism and fidelity. There is a need to develop and build the next generation of surgical simulators that improve haptic and visual realism. The primary challenges in building such high-fidelity simulations of soft-tissue organs come from two computationally demanding tasks that must execute in real time: managing the complexity of the geometric environment that is being dynamically modified during the procedure; and modeling the stresses and deformations of the soft tissue interacting with surgical instruments and subjected to cutting and suturing. The mechanics of soft tissue are further complicated by its anisotropic and nonlinear behavior.
In this presentation, we describe an initial prototype of a robotic-assisted simulator applied to a simplified task in a prostatectomy procedure (anastomosis). The simulator demonstrates new methodologies for modeling nonlinear tissue models, integrated high-resolution geometric contact detection for handling inter- and intra-organ collisions in the dynamically changing geometric environment of the simulation, and suturing with threads. The prototype is deployed on a bi-manual haptic feedback frame and serves as a building block for simulations operating in more complex anatomical structures.
-
-
-
Investigating the Dynamics of Densely Crowded Environments at the Hajj Using Image Processing Techniques
Authors: Khurom Hussain Kiyani and Maria PetrouAbstractBackground: With the world's population projected to grow from the current 6.8 billion to around 9 billion by 2050, the resulting growth of megacities, and the associated demands on public transport, there is an urgent imperative to understand the dynamics of crowded environments. Very dense crowds that exceed 3 people per square metre present many challenges for efficiently measuring quantities such as density and pedestrian trajectories. The Hajj, and the associated minor Muslim pilgrimage of Umrah, present some of the most densely crowded human environments in the world, and thus offer an excellent observational laboratory for the study of dense crowd dynamics. An accurate characterisation of such dense crowds can not only improve existing models, but can also help to develop better intervention strategies for mitigating crowd disasters such as the 2006 Hajj Jamarat stampede that killed over 300 pilgrims. With Qatar set to be one of the cultural centres in the region, e.g. hosting the FIFA World Cup 2022, the proper control and management of large singular events is important not only for our safety but also for our standing on the international stage.
Objectives: To use the data gathered from the Hajj to assess the dynamics of large dense crowds with a particular focus on crowd instabilities and pattern formation.
Methods: We will make use of advanced image processing and pattern recognition techniques (mathematical morphology, feature selection etc.) in assessing some of the bulk properties of crowds such as density and flow, as well as the finer details such as the ensemble of pedestrian trajectories. We are currently in the process of taking multiple wide-angle stereo videos at this year's Hajj, with our collaborators in Umm Al-Qurra University in Mecca. Multiple video capture of the same scene from different angles allows one to overcome the problem of occlusion in dense crowds.
Results: We will present our field study in the Hajj this year, where we took extensive high quality multiple camera video data. We will also present some of the techniques, which we will be using over the coming year in analyzing this large data set that we have now successfully collated.
-
-
-
Software for Biomechanical Performance Analysis of Force Plate Data
Authors: Manaf Kamil and Dino PalazziAbstractBackground: Force plates have been used in sports biomechanics since the 1960s. However, extracting useful information related to performance from curve analysis is a complicated process. It requires combined knowledge of signal processing (filtering), mechanical equations, testing protocols, biomechanics, and discrete mathematical analysis to properly process the data.
Objectives: The aim is to provide a practical and accurate tool to analyze force curves from select standard biomechanical performance tests (e.g., counter movement jump, drop jump).
Methods: The software is a tool built using the Microsoft .NET framework. Key features of the software include:
* Real-time data acquisition module able to acquire data from third-party 8-channel 3D force plates, with real-time results for immediate feedback during tests.
* Digital filtering module where the signal is conditioned to best suit the analysis.
* Analysis module able to calculate Force, Power, Velocities, Trajectories, Mechanical Impulse and Timing during the different phases of the tests using discrete analysis algorithms (see the sketch after this list).
* Reporting module for plotting and exporting selected variables.
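As referenced in the analysis-module item above, the kind of discrete impulse-momentum analysis involved can be sketched as follows on a synthetic force trace; the thresholds and the trace itself are illustrative and are not the software's actual algorithms.

```python
import numpy as np

G = 9.81  # m/s^2

def jump_metrics(force_n, dt, body_mass_kg):
    """Discrete impulse-momentum analysis of a vertical-jump force trace:
    integrate (F - m*g)/m to get velocity, take the value at take-off
    (force dropping to ~0) and convert it to jump height."""
    accel = force_n / body_mass_kg - G                 # net acceleration at each sample
    velocity = np.cumsum(accel) * dt                   # discrete integration
    takeoff_idx = np.argmax(force_n < 10.0)            # first near-zero force sample
    v_takeoff = velocity[takeoff_idx]
    height = v_takeoff ** 2 / (2 * G)                  # ballistic flight
    peak_power = np.max(force_n * velocity)            # instantaneous power, W
    return v_takeoff, height, peak_power

if __name__ == "__main__":
    # Synthetic trace: quiet standing, push-off above body weight, then flight.
    mass, dt = 75.0, 0.001
    force = np.concatenate([np.full(500, mass * G),
                            np.linspace(mass * G, 2.0 * mass * G, 300),
                            np.full(100, 2.0 * mass * G),
                            np.linspace(2.0 * mass * G, 0.0, 50),
                            np.zeros(300)])
    v, h, p = jump_metrics(force, dt, mass)
    print(f"take-off velocity {v:.2f} m/s, jump height {h:.2f} m, peak power {p:.0f} W")
```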
Results: The software has been used by ASPIRE Academy Sport Scientists in performance assessments of professional and semi-professional athletes from Qatar and other countries.
Currently, the software can analyze Counter Movement Jump, Drop Jump, Isometric Pulls and Squat Jump.
It contains automatic algorithms to detect specific points for each test type, but allows the user to change these suggestions when needed. Feedback is immediate, in both graphical and numerical form.
Conclusions: This novel software has proven to be a useful tool for immediate and accurate analysis and reporting of select field- and lab-based biomechanical tests. Going forward, further feedback from applied users can lead to more features being added. Considering the architecture of the software, adding more analysis modules is relatively simple; for example, work is currently underway on a sprint-running analysis module.
-
-
-
Autonomous Coverage Management in Smart Camera Networks
Authors: Vinay Kolar, Vikram Munishwar, Nael Abu-Ghazaleh and Khaled HarrasAbstractRecent advances in imaging and communication have led to the development of smart cameras that can operate autonomously and collaboratively to meet various application requirements. Networks of such cameras, called Smart Camera Networks (SCNs), have a range of applications in areas such as monitoring and surveillance, traffic management and health care. The self-configuring nature of the cameras, which adjust their pan, tilt and zoom (PTZ) settings, coupled with wireless connectivity, differentiates them substantially from classical multi-camera surveillance networks.
One of the important problems in SCNs is: “How to configure cameras to obtain the best possible coverage of events happening within the area of interest?” As the scale of cameras grows from tens to hundreds of cameras, it is impractical to rely on humans to configure cameras to best track areas of interest. Thus, supporting autonomous configuration of cameras to maximize their collective coverage is a critical requirement in SCNs.
Our research first focuses on a simplified version of the problem, where the field of view (FoV) of a camera can be adjusted only by changing its pan in a discrete manner. Even with such simplifications, solving the problem optimally is NP-hard. Thus, we propose centralized, distributed and semi-centralized heuristics that outperform the state-of-the-art approaches. Furthermore, the semi-centralized approach provides coverage accuracy close to the optimal, while reducing the communication latency by 97% and 74% compared to the centralized and distributed approaches, respectively.
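A simple centralized greedy heuristic for the discrete-pan variant can be sketched as below; the geometry, pan grid and parameters are hypothetical, and this is an illustration rather than the exact heuristics evaluated in this work.

```python
import math

def covers(camera, pan_deg, target, fov_deg=60.0, max_range=50.0):
    """True if `target` falls inside the camera's field of view for this pan setting."""
    dx, dy = target[0] - camera[0], target[1] - camera[1]
    dist = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - pan_deg + 180) % 360 - 180       # smallest angular difference
    return dist <= max_range and abs(diff) <= fov_deg / 2

def greedy_pan_assignment(cameras, targets, pans=range(0, 360, 30)):
    """Greedy heuristic: each camera in turn chooses the pan covering the most
    targets not yet covered by previously configured cameras."""
    uncovered = set(range(len(targets)))
    assignment = {}
    for ci, cam in enumerate(cameras):
        best_pan, best_hits = None, set()
        for pan in pans:
            hits = {ti for ti in uncovered if covers(cam, pan, targets[ti])}
            if len(hits) > len(best_hits):
                best_pan, best_hits = pan, hits
        assignment[ci] = best_pan
        uncovered -= best_hits
    return assignment, len(targets) - len(uncovered)

if __name__ == "__main__":
    cameras = [(0.0, 0.0), (40.0, 0.0)]
    targets = [(10.0, 5.0), (15.0, -8.0), (45.0, 20.0), (60.0, 0.0)]
    print(greedy_pan_assignment(cameras, targets))
```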
Next, we consider the problem without FoV constraints; we allow FoVs to be adjusted in the PTZ dimensions in a continuous manner. While PTZ configurations significantly increase the coverable area, continuous adjustment eliminates any sub-optimality resulting from discrete settings. However, supporting these features typically results in an extremely large number of feasible FoVs per camera, out of which only one optimal FoV will be selected. We show that the problem of finding the minimum set of feasible FoVs per camera is NP-hard in general; however, due to the geometric constraints introduced by the camera's FoV, it can be solved in polynomial time, and our proposed algorithm has a worst-case complexity of O(n³).
-
-
-
Query Processing in Private Data Outsourcing Using Anonymization
Authors: Ahmet Erhan Nergiz and Chris CliftonAbstractData outsourcing is a growing business. Cloud computing developments such as Amazon Relational Database Service promise further reduced cost. However, use of such a service can be constrained by privacy laws, requiring specialized service agreements and data protection that could reduce economies of scale and dramatically increase costs.
We propose a private data outsourcing approach where the link between identifying information and sensitive (protected) information is encrypted, with the ability to decrypt this link residing only with the client. As the server no longer has access to individually identifiable protected information, it is not subject to privacy laws and can offer a service that does not need to be customized to country- or sector-specific requirements; any risk of violating privacy through releasing sensitive information tied to an individual remains with the client. The data model used in this work is shown with an example in Figure 1.
This work presents a relational query processor operating within this model. The goal is to minimize communication and client-side computation, while ensuring that the privacy constraints captured in the anatomization are maintained. At first glance, this is straightforward: standard relational query processing at the server, except that any joins involving the encrypted key must be done at the client; an appropriate distributed query optimizer should do a reasonably good job of this. However, two issues arise that confound this simple approach:
1. By making use of the anatomy groups, and the knowledge that there is a one-to-one mapping (unknown to the server) between tuples in such groups, we can perform portions of the join between identifying and sensitive information at the server without violating privacy constraints, and
2. Performing joins at the client and sending results back to the server for further processing can violate privacy constraints.
-
-
-
GreenLoc: Energy Efficient Wifi-Based Indoor Localization
Authors: Mohamed Abdellatif, Khaled Harras and Abderrahmen MtibaaAbstractUser localization and positioning systems have been a core challenge in the domain of context-aware pervasive systems and applications. GPS has been the de facto standard for outdoor localization; however, the geo-satellite signals upon which GPS relies are inaccurate in indoor environments. Therefore, various indoor localization techniques based on triangulation, scene analysis, or proximity have been introduced. The most prominent technologies over which these techniques are applied include WiFi, Bluetooth, RFID, infrared, and UWB. Due to the ubiquitous deployment of access points, WiFi-based localization via triangulation has emerged as among the most prominent indoor positioning solutions. A major deployment obstacle for such systems, however, is the high energy consumption of WiFi adapters in mobile devices, where energy is the most valuable resource.
We propose GreenLoc, an indoor green localization system that exploits sensors prevalent in today's smartphones in order to dynamically adapt the frequency of the required location updates. Significant energy gains can therefore be achieved when users are not mobile. For example, accelerometers can aid in detecting different user states such as walking, running or stopping. Based on these states, mobile devices can dynamically decide upon the appropriate update frequency. We accommodate various motion speeds by estimating the velocity of the device using the latest two location coordinates and the time interval between these two recorded locations. We have taken the first steps towards implementing GreenLoc, based on the well-known Ekahau system. We have also conducted preliminary tests utilizing the accelerometer, gravity, gyroscope, and light sensors on the HTC Nexus One and iPhone 4 smartphones.
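The velocity-driven update policy can be sketched as follows; the speed thresholds and scan intervals are hypothetical values chosen for illustration, not measured settings.

```python
import math
import time

def estimate_speed(loc_prev, loc_curr, t_prev, t_curr):
    """Speed (m/s) from the last two recorded positions and their timestamps."""
    dist = math.hypot(loc_curr[0] - loc_prev[0], loc_curr[1] - loc_prev[1])
    return dist / max(t_curr - t_prev, 1e-6)

def next_update_interval(speed_mps, accel_state):
    """Pick the WiFi-scan interval from the accelerometer-detected motion state and
    the estimated speed; thresholds are illustrative, not measured values."""
    if accel_state == "stationary":
        return 60.0          # seconds: barely any scanning while the user is still
    if speed_mps < 0.5:
        return 20.0
    if speed_mps < 2.0:      # walking pace
        return 5.0
    return 2.0               # running or faster

if __name__ == "__main__":
    t0, t1 = time.time(), time.time() + 4.0
    speed = estimate_speed((0.0, 0.0), (6.0, 0.0), t0, t1)     # 1.5 m/s
    print(next_update_interval(speed, accel_state="walking"))  # 5.0
```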
To further save energy in typical indoor environments, such as malls, schools, and airports, GreenLoc exploits people's proximity when moving in groups. Devices within short range of each other do not each need to be tracked individually. Therefore, GreenLoc detects and clusters users moving together and elects a reference node (RN) based on device energy levels and needs. The elected RN is then tracked via triangulation, while the other nodes in the group are tracked relative to the RN's location using Bluetooth. Our initial analysis demonstrates very promising results for this system.
-
-
-
A Data Locality and Skew Aware Task Scheduler for MapReduce in Cloud Computing
Authors: Mohammad Hammoud, Suhail Rehman and Majd SakrAbstractInspired by the success and the increasing prevalence of MapReduce, this work proposes a novel MapReduce task scheduler. MapReduce is by far one of the most successful realizations of large-scale, data-intensive, cloud computing platforms. As compared to traditional programming models, MapReduce automatically and efficiently parallelizes computation by running multiple Map and/or Reduce tasks over distributed data across multiple machines. Hadoop, an open source implementation of MapReduce, schedules Map tasks in the vicinity of their input splits seeking diminished network traffic. However, when Hadoop schedules Reduce tasks, it neither exploits data locality nor addresses data partitioning skew inherent in many MapReduce applications. Consequently, MapReduce experiences a performance penalty and network congestion as observed in our experimental results.
Recently there has been some work concerned with leveraging data locality in Reduce task scheduling. For instance, one study suggests a locality-aware mechanism that inspects Map inputs and predicts corresponding consuming reducers. The input splits are subsequently assigned to Map tasks near the future reducers. While such a scheme addresses the problem, it targets mainly public-resource grids and doesn't fully substantiate the accuracy of the suggested prediction process. In this study, we propose Locality-Aware Skew-Aware Reduce Task Scheduler (LASAR), a practical strategy for improving MapReduce performance in clouds. LASAR attempts to schedule each reducer at its center-of-gravity node. It controllably avoids scheduling skew, a situation where some nodes receive more reducers than others, and promotes effective pseudo-asynchronous Map and Reduce phases resulting in earlier completion of submitted jobs, diminished network traffic, and better cluster utilization.
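The center-of-gravity placement with a skew cap can be sketched as below; the partition sizes and network cost model are invented for illustration and do not reflect LASAR's actual implementation inside Hadoop.

```python
def place_reducers(partition_sizes, cost, max_reducers_per_node):
    """Assign each reducer to its 'center-of-gravity' node: the node minimizing the
    weighted cost of fetching that reducer's map-output partitions, while capping
    the number of reducers per node to avoid scheduling skew.

    partition_sizes[r][n]: bytes of reducer r's input currently stored on node n.
    cost[src][dst]: relative network cost of moving one byte from src to dst.
    """
    num_nodes = len(cost)
    load = [0] * num_nodes
    placement = {}
    for r, sizes in enumerate(partition_sizes):
        candidates = [n for n in range(num_nodes) if load[n] < max_reducers_per_node]
        best = min(candidates,
                   key=lambda n: sum(sizes[src] * cost[src][n]
                                     for src in range(num_nodes)))
        placement[r] = best
        load[best] += 1
    return placement

if __name__ == "__main__":
    cost = [[0, 1, 2], [1, 0, 2], [2, 2, 0]]          # e.g., rack-aware relative costs
    partition_sizes = [[900, 100, 0], [50, 800, 150], [100, 100, 700], [600, 300, 100]]
    print(place_reducers(partition_sizes, cost, max_reducers_per_node=2))
```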
We implemented LASAR in Hadoop-0.20.2 and conducted extensive experiments to evaluate its potential. We found that it outperforms current Hadoop by 11%, and by up to 26%, on the utilized benchmarks. We believe LASAR is applicable to several cloud computing environments and multiple essential applications, including but not limited to shared environments and scientific applications. In fact, a large body of work has observed partitioning skew in many critical scientific applications. LASAR paves the way for these applications, and others, to be effectively ported to various clouds.
-
-
-
Multi-Layered Performance Monitoring for Cloud Computing Applications
Authors: Suhail Rehman, Mohammad Hammoud and Majd SakrAbstractCloud computing revolutionizes the way large amounts of data are processed and offers a compelling paradigm to organizations. An increasing number of data-intensive scientific applications are being ported to cloud environments such as virtualized clusters, in order to take advantage of increased cost-efficiency, flexibility, scalability, improved hardware utilization and reduced carbon footprint, among others.
However, due to the complexity of the application execution environment, routine tasks such as monitoring, performance analysis and debugging of applications deployed on the cloud become cumbersome and complex. These tasks often require close interaction and inspection of multiple layers in the application and system software stack. For example, when analyzing a distributed application that has been provisioned on a cluster of virtual machines, a researcher might need to monitor the execution of his program on the VMs, or the availability of physical resources to the VMs. This would require the researcher to use different sets of tools to collect and analyze performance data from each level.
Otus is a tool that enables resource attribution in clusters and currently reports only the virtual resource utilization and not the physical resource utilization on virtualized clusters. This is insufficient to fully understand application behavior on a cloud platform; it would fail to account for the state of the physical infrastructure, its availability or the variation in load by other VMs on the same physical host, for example.
We are extending Otus to collect metrics from multiple layers, starting with the hypervisor. Otus can now collect information from both the VM level and the hypervisor level; this information is stored in an OpenTSDB database, which scales to large clusters. A web-based application allows the researcher to selectively visualize these metrics in real time or for a particular time range in the past.
We have tested our multi-layered monitoring technique on several Hadoop MapReduce applications and clearly identified the causes of several performance problems that would otherwise not be clear using existing methods. Apart from helping researchers understand application needs, our technique could also help accelerate the development and testing of new platforms for cloud researchers.
-
-
-
Towards a New Termination Checker for the Coq Proof Assistant
AbstractModern societies rely on software applications for performing many critical tasks. As this trend increases, so does the necessity to develop cost-effective methods of writing software that ensure that essential safety and security requirements are met. In this context, dependent type theories are recently gaining adoption as a valid tool for performing formal verification of software.
The focus of this work is Coq, a proof assistant based on a dependent type theory called the Calculus of Inductive Constructions. Developed at INRIA (France) for over 20 years, it is arguably one of the most successful proof assistants to date. It has been used in several real-world large-scale projects such as the formalization of a verification framework for the Java Virtual Machine, a proof of the Four Color Theorem, and a formally verified compiler for the C programming language (the CompCert project).
Coq is both a proof assistant and a programming language. To ensure soundness of the formal verification approach, Coq imposes several conditions on the source programs. In particular, all programs written in Coq must be terminating. The current implementation of the termination checker uses syntactic criteria that are too restrictive and limiting in practice, hindering the usability of the system.
In previous work we proposed an extension of Coq using a termination checker based on the theory of sized types, and we showed the soundness of the approach. Furthermore, compared to the syntactic criteria currently used, our approach is more powerful, easier to understand, and easier to implement, as evidenced by a prototype implementation we developed.
Our objective is to turn our prototype into an implementation of a new core theory and termination checker for Coq. We expect that the resulting system will be more efficient and easier for users to understand. Furthermore, it will increase the expressive power and usability of Coq, permitting the use of formal verification on a wider range of applications.
-
-
-
A Natural Language Processing-Based Active and Interactive Platform for Accessing English Language Content and Advanced Language Learning
Authors: Kemal Oflazer, Teruko Mitamura, Tomas By, Hideki Shima and Eric RieblingAbstractSmartReader is a general-purpose “reading appliance” being implemented at Carnegie Mellon University (Qatar and Pittsburgh), building upon an earlier prototype version. It is an artificial intelligence system that employs advanced language processing technologies and can interact with the reader and respond to queries about the content, words and sentences in a text. We expect it to be used by students in Qatar and elsewhere to help improve their comprehension of English text. SmartReader is motivated by the observation that text is still the predominant medium for learning, especially at the advanced level, and that text, being “bland”, is hardly a conducive and motivating medium for learning, especially when one does not have access to tools that enable one to get over language roadblocks, ranging from unknown words to unrecognized and forgotten names to hard-to-understand sentences. SmartReader strives to make reading (English) textual material an “active” and “interactive” process, with the user interacting with the text through an anytime-anywhere contextually-guided query mechanism based on contextual user intent recognition. With SmartReader, a user can:
- inquire about the contextually correct meaning or synonyms of a word or of idiomatic and multi-word constructions;
- select a person's name and get an immediate “flashback” to the first (or the last) time the person was encountered in the text, to recall the details of that person;
- extract a summary of a section to remember important aspects of the content at the point she left off, and continue reading with a significantly refreshed context;
- select a sentence that she may not be able to understand fully and ask SmartReader to break it down, simplify it or paraphrase it;
- test her comprehension of the text in a page or a chapter by asking SmartReader to dynamically generate quizzes and answering them;
- ask questions about the content of the text and get answers, in addition to many other functions.
SmartReader is being implemented as a multi-platform (tablet/PC) client-server system using HTML5 technology, with Unstructured Information Management Architecture (UIMA) technology (used recently in IBM's Watson Q/A system in the Jeopardy Challenge) as the underlying language processing framework.
-
-
-
NEXCEL, A Deductive Spreadsheet
AbstractUsability and usefulness have made the spreadsheet one of the most successful computing applications of all times: millions rely on it every day for anything from typing grocery lists to developing multimillion dollar budgets. One thing spreadsheets are not very good at is manipulating symbolic data and helping users make decisions based on them. By tapping into recent research in logic programming, databases and cognitive psychology, we propose a deductive extension to the spreadsheet paradigm which addresses precisely this issue. The accompanying tool, which we call NEXCEL, is intended as an automated assistant for the daily reasoning and decision-making needs of computer users, in the same way as a spreadsheet application such as Microsoft Excel assists them every day with calculations simple and complex. Users without formal training in logic or computer science can interactively define logical rules in the same simple way as they define formulas in Excel. NEXCEL immediately evaluates these rules thereby returning lists of values that satisfy them, again just like with numerical formulas. The deductive component is seamlessly integrated into the traditional spreadsheet so that a user not only still has access to the usual functionalities, but is able to use them as part of the logical inference and, additionally, to embed deductive steps in a numerical calculation.
Under the hood, NEXCEL uses a small logic programming language inspired by Datalog to define derived relations: the target region of the spreadsheet contains a set of logical clauses in the same way that calculated cells contain a numerical formula in a traditional spreadsheet. Therefore, logical reasoning reduces to computing tables of data by evaluating Datalog-like definitions, a process that parallels the calculation of numerical formulas. Each row in the calculated relation is a tuple of values satisfying the definition for this relation, so that the evaluated table lists all such solutions, without repetitions. This linguistic extension significantly enriches the expressive power of the spreadsheet paradigm. Yet, it is provided to the user through a natural extension of the mostly excellent interfaces of modern spreadsheet applications.
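The Datalog-like evaluation described above can be illustrated with a tiny bottom-up fixpoint computation; the facts and the ancestor rule below are invented examples, and the code is not NEXCEL's syntax or engine.

```python
# Illustrative sketch: naive bottom-up evaluation of a Datalog-like definition,
# analogous to how a derived relation in a deductive spreadsheet region could
# be (re)computed. Facts and rules are hypothetical examples.
parent = {("alice", "bob"), ("bob", "carol"), ("carol", "dana")}

def evaluate_ancestor(parent_facts):
    """ancestor(X,Z) :- parent(X,Z).
       ancestor(X,Z) :- parent(X,Y), ancestor(Y,Z)."""
    ancestor = set(parent_facts)              # base rule
    while True:
        derived = {(x, z)
                   for (x, y) in parent_facts
                   for (y2, z) in ancestor
                   if y == y2}
        if derived <= ancestor:               # fixpoint: nothing new derivable
            return ancestor                   # all solutions, without repetitions
        ancestor |= derived

print(sorted(evaluate_ancestor(parent)))
```

Each tuple in the returned set corresponds to one row of the calculated relation, mirroring how calculated cells list all solutions of the definition.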
-
-
-
Characterizing Scientific Applications on Virtualized Cloud Platforms
AbstractIn general, scientific applications require different types of computing resources based on the application's behavior and needs. For example, page indexing in an Arabic search engine requires sufficient network bandwidth to process millions of web pages while seismic modeling is CPU and graphics intensive for real-time fluid analysis and 3D visualization. As a potential solution, cloud computing, with its elastic, on-demand and pay-as-you-go model, can offer a variety of virtualized compute resources to satisfy the demands of various scientific applications. Currently, deploying scientific applications onto large-scale virtualized cloud computing platforms is based on a random mapping or some rule-of-thumb developed through past experience. Such provisioning and scheduling techniques cause overload or inefficient use of the shared underlying computing resources, while delivering little to no satisfactory performance guarantees. Virtualization, a core enabling technology in cloud computing, enables the coveted flexibility and elasticity yet it introduces several difficulties with resource mapping for scientific applications.
In order to enable informed provisioning, scheduling and optimization on cloud infrastructures while running scientific workloads, we propose the use of a profiling technique to characterize the resource needs and behavior of such applications. Our approach provides a framework to characterize scientific applications based on their resource capacity needs, communication patterns, bandwidth needs, sensitivity to latency, and degree of parallelism. Although the programming model could significantly affect these parameters, we focus this initial work on characterizing applications developed using the MapReduce and Dryad programming models. We profile several applications while varying the cloud configurations and scale of resources in order to study their particular resource needs and behavior and to identify resources that limit performance. A manual and iterative process using a variety of representative input data sets is necessary to reach informative conclusions about the major characteristics of an application's resource needs and behavior. Using this information, we provision and configure a cloud infrastructure, given the available resources, to best target the given application. In this preliminary work, we show experimental results across a variety of applications and highlight the merit of precise application characterization in order to efficiently utilize the resources available across different applications.
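As a rough, simplified illustration of the profiling idea (not the authors' framework), one could periodically sample a workload's CPU, memory and network use and summarize the averages into a coarse characterization; the psutil library, sampling period and choice of metrics are assumptions.

```python
# Illustrative sketch: sample coarse resource metrics while a workload runs and
# summarize them into a simple characterization. Not the authors' framework.
import time
import statistics
import psutil  # third-party library; assumed available

def profile(duration_s=60, interval_s=1.0):
    cpu, mem, net = [], [], []
    last = psutil.net_io_counters()
    for _ in range(int(duration_s / interval_s)):
        cpu.append(psutil.cpu_percent(interval=interval_s))   # blocks for interval_s
        mem.append(psutil.virtual_memory().percent)
        now = psutil.net_io_counters()
        net.append((now.bytes_sent + now.bytes_recv
                    - last.bytes_sent - last.bytes_recv) / interval_s)
        last = now
    return {
        "cpu_mean_percent": statistics.mean(cpu),
        "mem_mean_percent": statistics.mean(mem),
        "net_mean_bytes_per_s": statistics.mean(net),
    }

if __name__ == "__main__":
    print(profile(duration_s=10))
```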
-
-
-
Human-Robot Interaction in an Arabic Social and Cultural Setting
Authors: Imran Fanaswala, Maxim Makatchev, Brett Browning, Reid Simmons and Majd SakrAbstractWe have permanently deployed Hala, the world's first English and Arabic robot receptionist, for 500+ days in an uncontrolled multi-cultural/multi-lingual environment within Carnegie Mellon Qatar.
Hala serves as a research testbed for studying the influence of socio-cultural norms and the nature of interactions during human-robot interaction within a multicultural, yet primarily ethnic Arab, setting.
Hala, as a platform, owes its uptime to several independently engineered components for modeling user interactions, syntactic and semantic language parsing, inviting users with a laser, handling facial animations, text-to-speech and lip synchronization, error handling and reporting, post dialogue analysis, networking/interprocess communication, and a rich client interface.
We conjecture that disparities exist in discourse, appearance, and non-verbal gestures amongst interlocutors of different cultures and native tongues. By varying Hala's behavior and responses, we gain insight into these disparities (if any), and we have therefore calculated: the rate of thanks after the robot's answer across cultures, user willingness to answer personal questions, the correlation between language and acceptance of robot invitations, the duration of conversations, and the effectiveness of running an open-ended experiment versus surveys.
We want to understand whether people communicate with a robot (rather, an inanimate object with human-like characteristics) differently than amongst themselves. Additionally, we want to extrapolate these differences/similarities while accounting for culture and language. Our results indicate that users in Qatar thanked Hala less frequently than their counterparts in the US. The robot often answered personal questions and inquiries (e.g., her marital status, job satisfaction); however, only 10% of the personal questions posed by the robot were answered by users. We observed a 34% increase in interactions when the robot initiated the conversation by inviting nearby users, and the subsequent duration of the conversation also increased by 30%. Bringing language into the mix, we observed that native Arabic speakers were twice as likely to accept an invitation from the robot and also tended to converse for 25% longer than users from other cultures.
These results indicate a disparity in interaction across English and Arabic users thereby encouraging the creation of culture specific dialogues, appearances and non-verbal gestures for an engaging social robot with regionally relevant applications.
-
-
-
Overcoming Machine Tools Blindness by a Dedicated Computer Vision System
Authors: Hussien J Zughaer and Ghassan Al-KindiAbstractAlthough Computerized Numerical Controlled (CNC) machines are currently regarded as the heart of machining workshops, they still suffer from machine blindness; hence, they cannot automatically judge the performance of applied machining parameters or monitor tool wear. Therefore, parts produced on these machines may not be as precise as expected. In this research an innovative system is developed and successfully tested to improve the performance of CNC machines. The system utilizes a twin-camera computer vision system, integrated with the CNC machine controller to facilitate on-line monitoring and assessment of machined surfaces. The outcome of the monitoring and assessment task is used for real-time control of the applied machining parameters by a decision-making subsystem, which automatically decides whether to keep or alter the employed machining parameters or to apply a tool change. To facilitate the integration of computer vision with CNC machines, a comprehensive system is developed to tackle a number of pinpointed issues that obstruct such integration, including scene visibility (e.g. effects of coolant and cut chips as well as camera mounting and lighting), effects of machine vibration on the quality of obtained roughness measurements, selection of the most appropriate roughness parameter to employ, and assessment of the effects of machining parameters on acquired roughness measurements. Two system rigs employing different models of CNC machines were developed and used in the conducted tests to help generalize the findings. Two cameras are mounted on the machine spindle of each of the two CNC machines to provide valid image data according to the cutting direction. Selection and activation of the relevant camera is achieved automatically by the developed system, which analyzes the most recent tool path movement to decide which camera is to be activated. In order to assess machined surface quality and cutting tool status, image data are processed to evaluate the resulting tool imprints on the machined surface. An indicating parameter to assess the resulting tool imprints is proposed and used. The overall results show the validity of the approach and encourage further development to realize wider-scale applications of vision-based CNC machines.
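The abstract does not give the selection logic in code, so the sketch below only illustrates the general idea of activating the camera most aligned with the latest tool path move; the camera names, viewing axes and example moves are hypothetical.

```python
# Illustrative sketch of direction-based camera selection: given the most
# recent tool path move, activate the camera whose viewing axis is best
# aligned with the cutting direction. Camera axes are hypothetical values.
import math

CAMERAS = {
    "camera_x": (1.0, 0.0),   # assumed to view along the machine X axis
    "camera_y": (0.0, 1.0),   # assumed to view along the machine Y axis
}

def select_camera(dx, dy):
    """Pick the camera most aligned with the last tool move (dx, dy)."""
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        return None  # no movement: keep the currently active camera
    best, best_score = None, -1.0
    for name, (ax, ay) in CAMERAS.items():
        score = abs(dx * ax + dy * ay) / norm  # |cosine| of the alignment angle
        if score > best_score:
            best, best_score = name, score
    return best

print(select_camera(10.0, 2.0))   # -> camera_x
print(select_camera(-1.0, 8.0))   # -> camera_y
```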
-
-
-
EEG - Mental Task Discrimination by Digital Signal Processing
AbstractRecent advances in computer hardware and signal processing have made possible the use of EEG signals or “brain waves” for communication between humans and computers. Locked-in patients now have a way to communicate with the outside world, but even with the latest techniques, such systems still suffer from communication rates on the order of 2-3 tasks/minute. In addition, existing systems are not likely to be designed with flexibility in mind, leading to slow systems that are difficult to improve.
This thesis classifies different mental tasks through the use of the electroencephalogram (EEG). EEG signals recorded from several subjects through multiple channels (electrodes) have been studied during the performance of five mental tasks: a baseline task, for which the subjects were asked to relax as much as possible; a multiplication task, for which the subjects were given a nontrivial multiplication problem without vocalizing or making any other movements; a letter-composing task, for which the subjects were instructed to mentally compose a letter without vocalizing (imagine writing a letter to a friend in their head); a rotation task, for which the subjects were asked to visualize a particular three-dimensional block figure being rotated about its axis; and a counting task, for which the subjects were asked to imagine a blackboard and to visualize numbers being written on the board sequentially.
The work presented here may be part of a larger project whose goal is to classify EEG signals belonging to a varied set of mental activities in a real-time Brain Computer Interface, in order to investigate the feasibility of using different mental tasks as a wide communication channel between people and computers.
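As an illustration of the kind of signal processing pipeline such a study typically involves (not the thesis's exact method), the sketch below extracts band-power features with Welch's method and trains a linear classifier on labelled EEG windows; the sampling rate, frequency bands and use of scikit-learn are assumptions.

```python
# Illustrative sketch, not the thesis's exact pipeline: band-power features
# computed with Welch's method, then a linear classifier over labelled windows
# of multi-channel EEG. Sampling rate and band edges are assumed values.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250                              # assumed sampling rate in Hz
BANDS = [(4, 8), (8, 13), (13, 30)]   # theta, alpha, beta bands

def band_powers(window):
    """window: array of shape (n_channels, n_samples) -> feature vector."""
    feats = []
    for channel in window:
        freqs, psd = welch(channel, fs=FS, nperseg=FS)
        for lo, hi in BANDS:
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(np.trapz(psd[mask], freqs[mask]))  # power in the band
    return np.array(feats)

def train(windows, labels):
    """windows: list of (n_channels, n_samples) arrays; labels: mental task ids."""
    X = np.vstack([band_powers(w) for w in windows])
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf
```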
-
-
-
Joint Hierarchical Modulation and Network Coding for Two Way Relay Network
Authors: Rizwan Ahmad, Mazen O. Hasna and Adnan Abu-DayyaAbstractCooperative communications has gained a lot of attention in the research community recently. This is possible because the broadcast nature of wireless networks, which was earlier considered a drawback, can now be used to provide spatial diversity to increase throughput, reduce energy consumption and provide network resilience. The main drawback of cooperative communication is that it requires more bandwidth than traditional communication networks. Decode and Forward (DF) is one of the cooperative communication forwarding strategies, in which the relay node first decodes the data and then retransmits it to the destination. DF requires advanced techniques, which can be used at the intermediate relay nodes to improve spectrum utilization.
Some well-known techniques for spectrum efficiency are Network Coding (NC) and Hierarchical Modulation (HM). In the NC technique, nodes in a network are capable of combining packets for transmission, thus reducing the number of transmissions. HM is a technique that allows transmission of multiple data streams simultaneously. Both HM and NC are useful techniques for spectral efficiency.
In this work, we evaluate the performance of a joint HM and NC scheme for two-way relay networks. The relaying is based on the signal-to-noise ratio (SNR) threshold at the relay. In particular, a two-way cooperative network with two sources and one relay is considered, as shown in fig 1. Two different protection classes are modulated by a hierarchical 4/16-Quadrature Amplitude Modulation (QAM) constellation at the source. Based on the instantaneous received SNR at the relay, the relay decides to retransmit both classes using a hierarchical 4/16-QAM constellation, or only the more-protected class using a Quadrature Phase Shift Keying (QPSK) constellation, or to remain silent. These thresholds at the relay give rise to multiple transmission scenarios in a two-way cooperative network. Simulation results are provided to verify the analysis.
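The relay's threshold behaviour described above can be sketched as a simple decision function; the two SNR threshold values below are hypothetical placeholders, not values from the paper.

```python
# Illustrative sketch of the relay's threshold logic: based on the
# instantaneous received SNR, forward both classes (hierarchical 4/16-QAM),
# only the more-protected class (QPSK), or stay silent. Thresholds are assumed.
SNR_BOTH_CLASSES_DB = 18.0   # hypothetical threshold for hierarchical 4/16-QAM
SNR_BASE_CLASS_DB = 9.0      # hypothetical threshold for QPSK only

def relay_action(snr_db):
    if snr_db >= SNR_BOTH_CLASSES_DB:
        return "retransmit both classes with hierarchical 4/16-QAM"
    if snr_db >= SNR_BASE_CLASS_DB:
        return "retransmit the more-protected class with QPSK"
    return "remain silent"

for snr in (22.0, 12.5, 4.0):
    print(snr, "dB ->", relay_action(snr))
```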
-
-
-
Repairing Access Control Configurations via Answer Set Programming
Authors: Khaled Mohammed Khan and Jinwei HuAbstractAlthough various access control models have been proposed, access control configurations are error prone, and there is no assurance of their correctness. When we find errors in an access control configuration, we take immediate action to repair the configuration. The repair is difficult, largely because arbitrary changes to the configuration may result in no less threat than the errors do. In other words, constraints are placed on the repaired configuration. The presence of constraints makes a manual trial-and-error approach less attractive. There are two main shortcomings with the manual approach. Firstly, it is not clear whether the objectives are reachable at all; if not, we waste time trying to repair an error-prone configuration. Secondly, we have no knowledge of the quality of the solution, such as the correctness of the repair.
In order to address these shortcomings, we aim to develop an automated approach to the task of repairing access control configurations. We have utilized answer set programming (ASP), a declarative knowledge representation paradigm, to support such an automated approach. The rich modeling language of ASP enables us to capture and express the repair objectives and the constraints. In our approach, the repair instance is translated into an ASP program, and ASP solvers are invoked to evaluate it.
Although applications of ASP follow the general “encode-compute-extract” approach, they differ in how the problems are represented in ASP. In our case, there are two principal factors which render the proposed problem and approach non-trivial. Firstly, we need to identify constraints which are not only amenable to ASP interpretation, but also expressive enough to capture common idioms of security and business requirements; there is a trade-off to make. Secondly, our ASP programs should model the quality measure of repairs: when more than one repair is possible, the reported one is optimized in terms of the quality measure. We have also undertaken extensive experiments on both real-world and synthetic data sets. The experimental results validate the effectiveness and efficiency of the automated approach.
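Since the ASP encoding itself is not reproduced in the abstract, the sketch below illustrates the underlying repair problem with a deliberately naive brute-force search in Python rather than ASP: it looks for the smallest set of permission changes that removes all violations of a toy separation-of-duty constraint. The policy, candidate changes and quality measure (fewest changes) are all hypothetical, and this is not the authors' approach.

```python
# Illustrative brute-force sketch of the repair problem (NOT the authors' ASP
# encoding): find a smallest set of permission changes that removes all
# violations of a toy separation-of-duty constraint.
from itertools import combinations

config = {("alice", "approve"), ("alice", "submit"), ("bob", "audit")}
candidate_changes = [("remove", ("alice", "approve")),
                     ("remove", ("alice", "submit")),
                     ("add", ("carol", "approve"))]

def violates(cfg):
    # Separation of duty: nobody may hold both "submit" and "approve".
    return any((u, "submit") in cfg and (u, "approve") in cfg for u, _ in cfg)

def apply_changes(cfg, changes):
    cfg = set(cfg)
    for op, perm in changes:
        if op == "remove":
            cfg.discard(perm)
        else:
            cfg.add(perm)
    return cfg

def best_repair():
    # Quality measure: prefer repairs with the fewest changes.
    for size in range(len(candidate_changes) + 1):
        for changes in combinations(candidate_changes, size):
            if not violates(apply_changes(config, changes)):
                return changes
    return None

print(best_repair())
```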
-
-
-
A Security Profile-based Assurance Framework for Service Software
AbstractService software is a self-contained, modular application deployed over standard computing platforms and readily accessible by users within or across organizational boundaries using the Internet. For businesses to open up their applications for interaction with other service software, a fundamental requirement is that there be sufficient choices of security provisions, allowing service consumers to select and verify the actual security assurances of services. In this context, the specific research challenge is how we could design service software focusing on the consumer's specific security requirements, and provide assurances for those security needs. Clearly, security requirements vary from consumer to consumer. This work outlines a framework focusing on the selection of service software consistent with the security requirements of consumers, and on compatibility checking of the assurances provided by services. We use profile-based compatibility analysis techniques to form an essential building block towards assuring the security of service software.
In our research, we envision a security profile based compatibility checking that focuses more on automatic analysis of security compatibility using formal analysis techniques of security properties of software services. Our approach is based on three main building blocks: reflection of security assurances; selection of preferred assurances; and checking of security compatibility. Clearly, our vision and research for service security based on profile based compatibility analysis will form an essential building block towards realizing the full potential of service oriented computing. We foresee that the provision of the proposed scheme for service security profiling and compatibility analysis will significantly advance the state of practice in service oriented computing. At the same time, its development represents a new and highly challenging research target in the area.
This work is of great significance to the development of future software systems that facilitate security-aware cross-organizational business activities. The envisioned capability to integrate service software across-organizational boundaries that meets security requirements of all parties involved represents a significant technological advance in enabling practical business-to-business computing, leading to new business opportunities. At the same time, the approach will make significant scientific advancement in understanding the problem of application-level system security in a service oriented computing context.
-
-
-
Hierarchical Clustering for Keyphrase Extraction from Arabic Documents Based on Word Context
Authors: Rehab Duwairi, Fadila Berzou and Souad MecheterAbstractKeyphrase extraction is a process by which the set of words or phrases that best describe a document is specified. The phrases could be extracted from the words of the document itself, or they could be external, specified from an ontology for a given domain. Extracting keyphrases from documents is critical for many applications such as information retrieval, document summarization or clustering. Many keyphrase extractors view the problem as a classification problem and therefore need training documents (i.e. documents whose keyphrases are known in advance). Other systems view keyphrase extraction as a ranking problem: the words or phrases of a document are ranked based on their importance, and phrases with high importance (usually located at the beginning of the list) are recommended as possible keyphrases for the document.
This abstract describes Shihab, a system for extracting keyphrases from Arabic documents. Shihab views keyphrase extraction as a ranking problem. The list of keyphrases is generated by clustering the phrases of a document. Phrases are built from words which appear in the document and consist of 1, 2 or 3 words. The idea is to group similar phrases into one cluster. The similarity between phrases is determined by calculating the Dice value of their corresponding contexts, where a phrase's context is the sentence in which that phrase appears. Agglomerative hierarchical clustering is used in the clustering phase. Once the clusters are ready, each cluster nominates a phrase to the set of candidate keyphrases. This phrase is called the cluster representative and is determined according to a set of heuristics. Shihab's results were compared with those of existing keyphrase extractors such as KP-Miner and Arabic-KEA, and the results were encouraging.
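A minimal sketch of the context-based similarity idea described above, not the Shihab implementation: Dice similarity between phrase contexts converted into distances for average-linkage agglomerative clustering with SciPy. The phrases, contexts and cut-off threshold are invented.

```python
# Illustrative sketch (not Shihab): Dice similarity between phrase contexts,
# turned into a distance matrix for average-linkage agglomerative clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

contexts = {  # phrase -> the sentence (context) in which it appears
    "information retrieval": "keyphrases support information retrieval tasks",
    "document clustering": "keyphrases support document clustering tasks",
    "arabic morphology": "arabic morphology complicates phrase extraction",
}

def dice(a_words, b_words):
    a, b = set(a_words), set(b_words)
    return 2 * len(a & b) / (len(a) + len(b))

phrases = list(contexts)
n = len(phrases)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        d = 1.0 - dice(contexts[phrases[i]].split(),
                       contexts[phrases[j]].split())
        dist[i, j] = dist[j, i] = d

Z = linkage(squareform(dist), method="average")     # agglomerative clustering
labels = fcluster(Z, t=0.6, criterion="distance")   # cut the dendrogram
for phrase, label in zip(phrases, labels):
    print(label, phrase)                            # each cluster then nominates a representative
```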
-
-
-
On the Design of Learning Games and Puzzles for Children with Intellectual Disability
Authors: Aws Yousif Fida El-Din and Jihad Mohamed AljaamAbstractThe objective of this paper is to present the edutainment learning games that we are developing for Qatari children with moderate intellectual disability. These games will help them to learn effectively in fun and enjoyable ways. We use multimedia technology merged with intelligent algorithms to guide the children in play. As the number of children with intellectual disability is increasing, early intervention to teach them properly using information technology is very important. However, few research projects on disability are being conducted in the Arab world, and these projects are still not enough to respond to the real needs of the disabled and achieve satisfactory outcomes. Developing edutainment games for children with intellectual disability is a very challenging task. First, it requires content developed by specialized instructors. Second, the interface design of the games must be presented clearly and be easy to interact with. Third, the games must run slowly, in order to give the children some time to think and interact. Fourth, regardless of the results, the game must allow a minimum level of general satisfaction, to avoid depressing the children. Fifth, the game must make maximum use of multimedia elements to draw the children's attention. We show some multimedia applications for children with different disabilities that were developed at Qatar University; one enhances mathematics skills with a symbolic gift reward when the child guesses the answer correctly, a feature that motivated the children to play the game several times a day. The applications also use videos to illustrate the game before the children play it. The purpose of the second multimedia application is to test the children's memory; it uses different multimedia elements to present games that require deep concentration in order to guess the answer. These games helped the children develop a strong sense of self-confidence. The learning puzzles that we have developed are based on intelligent algorithms that avoid cycling and allow the children to reach a solution. Two different approaches were used: Simulated Annealing and Tabu Search.
-
-
-
Minimal Generators Based Algorithm for Text Features Extraction: A More Efficient and Large Scale Approach
Authors: Samir Elloumi, Fethi Fejani, Sadok Ben Yahia and Ali JaouaAbstractIn recent years, several mathematical concepts have been successfully explored in the computer science domain, as a basis for finding original solutions to complex problems related to knowledge engineering, data mining, information retrieval, etc.
Thus, Relational Algebra (RA) and Formal Concept Analysis (FCA) may be considered useful mathematical foundations that unify data and knowledge in information retrieval systems. For example, some elements of a fringe relation (from the RA domain), called isolated points, have been successfully used in FCA as formal concept labels or composed labels. Once associated with words in a textual document, these labels constitute relevant features of a text. Here, we propose the GenCoverage algorithm for covering a Formal Context (as a formal representation of a text) based on isolated labels, and we use these labels (or text features) for categorization, corpus structuring and micro-macro browsing as an advanced functionality in the information retrieval task.
The main thrust of the introduced approach relies on the close connection between isolated points and minimal generators (MGs). MGs stand at the antipodes of the closures within their respective equivalence classes. Since minimal generators are the smallest elements within an equivalence class, their detection and traversal are largely eased, permitting a swift building of the coverage. Thorough experiments provide empirical evidence of the performance of our approach.
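As a small illustration of the formal-context machinery mentioned above (not the GenCoverage algorithm itself), the sketch below implements the closure operator of a toy context and a naive minimal-generator test; the context and attribute names are invented.

```python
# Illustrative sketch (not GenCoverage): the closure operator of a toy formal
# context and a naive test for minimal generators, i.e. attribute sets none of
# whose proper subsets has the same closure.
from itertools import combinations

# Toy context: object -> set of attributes (e.g. words appearing in a sentence)
context = {
    "s1": {"a", "b"},
    "s2": {"a", "b", "c"},
    "s3": {"b", "c"},
}

def closure(attrs):
    """Attributes shared by all objects that contain every attribute in attrs."""
    objs = [o for o, has in context.items() if attrs <= has]
    if not objs:
        return set.union(*context.values())
    return set.intersection(*(context[o] for o in objs))

def is_minimal_generator(attrs):
    cl = closure(attrs)
    return all(closure(set(sub)) != cl
               for r in range(len(attrs))
               for sub in combinations(attrs, r))

print(closure({"a"}))                    # -> {'a', 'b'}
print(is_minimal_generator({"a"}))       # True: the empty set has a different closure
print(is_minimal_generator({"a", "b"}))  # False: {'a'} already closes to {'a', 'b'}
```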
-
-
-
Preserving Privacy from Unsafe Data Correlation
Authors: Bechara Al Bouna, Christopher Clifton and Qutaibah MalluhiAbstractWith the emergence of cloud computing, providing safe data outsourcing has become an active topic. Several regulations have been issued to foresee that individual and corporate information would be kept private in a cloud computing environment. To guarantee that these regulations are fully maintained, the research community has proposed privacy constraints such as k-anonymity, l-diversity and t-closeness. These constraints are based on generalization, which transforms identifying attribute values into a more general form and partitions the data to eliminate possible linking attacks. Despite their efficiency, generalization techniques severely affect the quality of outsourced data and their correlation. To cope with such defects, Anatomy has been proposed. Anatomy releases quasi-identifier values and sensitive values in separate tables, which essentially preserves privacy and at the same time captures a large amount of data correlation. However, there are situations where data correlation could lead to an unintended leak of information. For example, if an adversary knows that patient Roan (P1) takes a regular drug, the join of Prescription (QIT) and Prescription (SNT) on the attribute GID leads to the association of RetinoicAcid with patient P1 due to correlation.
In this paper, we present a study to counter privacy violations due to data correlation and at the same time improve aggregate analysis. We show that privacy requirements affect table decomposition based on what we call correlation dependencies. We propose a safe grouping principle to ensure that correlated values are grouped together in unique partitions that obey l-diversity and at the same time preserve the correlation. An optimization strategy is designed as well to reduce the number of anonymized tuples. Finally, we extended the UTD Anonymization Toolbox to implement the proposed algorithm and demonstrate its efficiency.
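A minimal sketch of the kind of check the safe grouping principle implies, not the actual UTD Anonymization Toolbox extension: each partition must be l-diverse in its sensitive values while every correlated value stays in a single partition. The tuples and attribute names are invented.

```python
# Illustrative sketch (not the toolbox extension): verify that a grouping is
# l-diverse per partition and keeps each correlated value in a single partition.
from collections import defaultdict

# (group_id, correlated_value, sensitive_value) - hypothetical anatomized tuples
tuples = [
    (1, "drug_A", "flu"),
    (1, "drug_B", "asthma"),
    (2, "drug_C", "flu"),
    (2, "drug_D", "diabetes"),
]

def safe_grouping(rows, l=2):
    sensitive_per_group = defaultdict(set)
    groups_per_value = defaultdict(set)
    for gid, value, sensitive in rows:
        sensitive_per_group[gid].add(sensitive)
        groups_per_value[value].add(gid)
    l_diverse = all(len(s) >= l for s in sensitive_per_group.values())
    correlation_kept = all(len(g) == 1 for g in groups_per_value.values())
    return l_diverse and correlation_kept

print(safe_grouping(tuples, l=2))  # True for this toy grouping
```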
-
-
-
News Alerts Trigger System to Support Business Owners
Authors: Jihad Mohamad Aljaam, Khaled Sohil Alsaeed and Ali Mohamad JaouaAbstractThe exponential growth of financial news coming from different sources makes it very hard to benefit from them effectively. Business decision makers who rely on this news are unable to follow it accurately in real time. They always need to be alerted immediately to any potential financial events that may affect their businesses, whenever these occur. Many important news items can be buried in thousands of lines of financial data and cannot be detected easily. Such news may sometimes have a major impact on businesses, and the key decision makers should be alerted about it. In this work, we propose an alert system that screens structured financial news and triggers alerts based on user profiles. These alerts have different priority levels: low, medium and high. Whenever the alert priority level is high, a quick intervention should be taken to avoid potential risks to the business. Such events are considered as tasks and should be treated immediately. Matching user profiles with news events can sometimes be straightforward; it can also be challenging, especially when the keywords in the user profiles are only synonyms of the event keywords. In addition, alerts can sometimes depend on a combination of correlated news events coming from different sources of information, which makes their detection a computationally challenging problem. Our system allows users to define their profiles in three different ways: (1) selecting from a list of keywords related to event properties; (2) providing free keywords; and (3) entering simple short sentences. The system triggers alerts immediately whenever news events related to a user's profile occur. It takes into consideration correlated events and the concordance of event keywords with the synonyms of the user profile keywords. The system uses the vector space model to match keywords with the news event words. As a consequence, the rate of false-positive alerts is low whereas the rate of false-negative alerts is relatively high. However, enriching the dictionary of synonyms would reduce the false-negative alert rate.
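The abstract does not spell out the matching implementation, so the following is only a hedged sketch of vector space matching between profiles and events, with TF-IDF vectors, cosine similarity and made-up priority thresholds; scikit-learn is assumed to be available and the example texts are invented.

```python
# Illustrative sketch of vector-space matching between user profiles and news
# events (not the deployed system): TF-IDF vectors, cosine similarity, and
# hypothetical thresholds mapping similarity scores to alert priorities.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

profiles = ["oil price increase shipping cost",
            "bank merger acquisition announcement"]
events = ["central bank announces merger review of two lenders",
          "crude oil prices surge after supply cut"]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(profiles + events)
profile_vecs, event_vecs = matrix[:len(profiles)], matrix[len(profiles):]

def priority(score):            # assumed thresholds, for illustration only
    if score >= 0.5:
        return "high"
    if score >= 0.2:
        return "medium"
    return "low"

similarity = cosine_similarity(profile_vecs, event_vecs)
for i, profile in enumerate(profiles):
    for j, event in enumerate(events):
        print(priority(similarity[i, j]), "|", profile, "<->", event)
```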
-
-
-
Assistive Technology for People with Hearing and Speaking Disabilities
AbstractThe community of people with hearing or speaking disabilities represents a significant component of society that needs to be well integrated in order to foster great advancement by leveraging the contributions of every member of society. When these people cannot read lips, they usually need interpreters to help them communicate with people who do not know sign language, and they also need an interpreter when they use phones, because communication is not easy without special aiding devices such as a Relay Service or Instant Messaging (IM). As the number of people with hearing and speaking disabilities is increasing significantly, building bridges of communication between the deaf and hearing communities is essential to deepen mutual cooperation in all aspects of life. The problem could be summarized in one question: how can we construct this bridge to allow people with hearing and speaking difficulties to communicate?
This project suggests an innovative framework that contributes to the efficient integration of people with hearing disabilities into society by using wireless communication and mobile technology. Unlike existing solutions (Captel and Relay Service), this project is completely operator independent; it depends on an extremely powerful Automatic Speech Recognition and Processing Server (ASRPS) that can process speech and transform it into text. Independent means that it recognizes speech regardless of the speaker and the characteristics of his/her voice. On the other side there will be a Text To Speech (TTS) engine, which will take the text sent to the application server and transmit it as speech. The second aim of the project is to develop an iPhone/iPad application for the hearing impaired. The application facilitates the reading of the received text by converting it into sign language animations, which are constructed from a database; we are currently using American Sign Language for its simplicity. Nevertheless, the application can be further developed for other languages such as Arabic Sign Language and British Sign Language. The application also assists the writing process through a customized user interface, including a customized keyboard, that allows the deaf to communicate efficiently with others.
-
-
-
Using Cognitive Dimensions Framework to Evaluate Constraint Diagrams
Authors: Noora Fetais and Peter ChengAbstractThe Cognitive Dimensions of Notations are a heuristic framework created by Thomas Green for analysing the usability of notational systems. Microsoft used this framework as a vocabulary for evaluating the usability of their C# and .NET development tools. In this research we used the framework to compare the evaluation of Constraint Diagrams with the evaluation of Natural Language by running a usability study. The result of this study will help in determining whether users would be able to use constraint diagrams to accomplish a set of tasks, and from it we can predict difficulties that may be faced when working on these tasks. Two steps were required. The first step is to decide what generic activities a system is desired to support. An activity is described at a rather abstract level in terms of the structure of information and constraints on the notational environment. Cognitive dimensions constitute a method to theoretically evaluate the usability of a system, and the dimensional checklist approach is used to improve different aspects of the system. Each improvement will be associated with a trade-off cost on other aspects. Each generic activity has its own requirements in terms of cognitive dimensions, so the second step is to scrutinize the system and determine how it lies on each dimension. If the two profiles match, all is well. Every dimension should be described with illustrative examples, case studies, and associated advice for designers. In general, an activity such as exploratory design, where software designers make changes at different levels, is the most demanding activity. This means that dimensions such as viscosity and premature commitment must be low while visibility and role-expressiveness must be high.
-
-
-
IT System for Improving Capability and Motivation of Workers: An Applied Study with Reference to Qatar
Authors: Hussein Alsalemi and Pam MayhewAbstractInformation Systems (IS) is a discipline that has its roots in the field of organizational development. Information Technology (IT) is a primary tool that has been used by IS to support the aim of developing organisations. However, IT has focused on supporting two main areas in organisations to help them become better at achieving their goals. These two areas are operational and strategic. Little research has been devoted to supporting the human element (the workforce) with the aim of improving organisations' abilities to achieve their goals. In IS, Socio-Technical Theory is one theory that researches approaches to improving employees' satisfaction for the sake of better work productivity and business value. This theory is based on the idea of achieving harmonious integration of different subsystems in an organization (social, technical and environmental subsystems).
The aim of this research is to find out if IT can be used to improve the capability and motivation of the workforce in organisations so that these organizations can better achieve their goals.
This research is an applied study with reference to the Qatar National Vision 2030 (QNV2030) Human Pillar. Using grounded theory (GTH) research methodology, the research characterized the main factors that affect the capability and motivation of the Qatari workforce. These findings were used to develop a theoretical model that identifies core factors, gives a description of each factor and explains the interactions between them. The theoretical model was then transformed into an IT system consisting of a number of IT tools, which was then tested in different organizations in Qatar. The test was to evaluate its effectiveness in improving the capabilities and motivation of the Qatari workforce and to explore its effectiveness in helping these organizations better achieve their goals.
The research concluded that the developed IT system based on the theoretical model can help in improving the motivation and capability of a workforce, provided a defined set of organizational and leadership qualities exists within the organisation.
-
-
-
Utilization of Mixtures as Working Fluids in Organic Rankine Cycle
Authors: Mirko Stijepovic, Patrick Linke, Hania Albatrni, Rym Kanes, Umaira Nisa, Huda Hamza, Ishrath Shiraz and Sally El MeragawiAbstractOver the past several years, ORC processes have become very promising for power production from low grade heat sources: solar, biomass, geothermal and waste heat. The key challenge in the design process is the selection of an appropriate working fluid. A large number of authors have used pure components as working fluids and assessed ORC performance.
ORC systems that use a single working fluid component have two major shortcomings. First, the majority of applications involve temperatures of the heat sink and source fluid that vary during the heat transfer process, whereas the temperature of the working fluid during evaporation and condensation remains constant. As a consequence, a pinch point is encountered in the evaporator, giving rise to large temperature differences at one end of the heat exchanger; this leads to irreversibility that in turn reduces process efficiency. A similar situation is also encountered in the condenser. A second shortcoming of the Rankine cycle is a lack of flexibility.
These shortcomings result from a mismatch between the thermodynamic properties of pure working fluids, the requirements imposed by the Rankine cycle and the particular application. In contrast, when working fluid mixtures are used instead of single-component working fluids, improvements can be obtained in two ways: through inherent properties of the mixture itself, and through cycle variations which become available with mixtures. The most obvious positive effect is a decrease in energy destruction, since the occurrence of a temperature glide during a phase change provides a good match of temperature profiles in the condenser and evaporator.
This paper presents detailed simulations and economic analyses of Organic Rankine Cycle processes for energy conversion of low grade heat sources. The paper explores the effect of mixture utilization on common ORC performance assessment criteria in order to demonstrate the advantages of employing mixtures as working fluids compared to pure fluids. We illustrate these effects using zeotropic mixtures of paraffins as ORC working fluids.
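For reference, the standard first-law efficiency used to compare such cycles, together with an illustrative pinch-point constraint, can be written as below; these are textbook relations rather than formulas taken from the paper, and the notation (position x along the exchanger, minimum approach temperature ΔT_min, working-fluid profile T_wf) is illustrative.

```latex
% Textbook first-law ORC efficiency and an illustrative pinch-point constraint;
% background only, not taken from the paper.
\begin{align}
  \eta_{\mathrm{th}} &= \frac{\dot{W}_{\mathrm{turbine}} - \dot{W}_{\mathrm{pump}}}{\dot{Q}_{\mathrm{evaporator}}} \\
  \min_{x}\,\bigl(T_{\mathrm{source}}(x) - T_{\mathrm{wf}}(x)\bigr) &\ge \Delta T_{\mathrm{min}}
\end{align}
```

A zeotropic mixture's temperature glide lets the working-fluid temperature rise during evaporation, keeping the temperature difference in the second relation more uniform along the exchanger and thereby reducing the irreversibility discussed above.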
-
-
-
Application of Nanotechnology in Hybrid Solar Cells
Authors: Narendra Kumar Agnihotra and S SakthivelAbstractPlastic/organic/polymer photovoltaic solar cells are fourth-generation cells; however, the efficiency, thermal stability and cost of fourth-generation solar cells are still not sufficient to replace conventional solar cells. Hybrid solar cells have been one of the alternative technologies for harnessing solar power into electrical power and overcoming the high cost of conventional solar cells. This review paper focuses on the concept of hybrid solar cells combining organic/polymer materials blended with inorganic semiconducting materials. The paper presents the importance of nanoscale materials and of their shape and size (nanotubes, nanowires, nanocrystals), which can increase the efficiency of solar cells. The study shows that nanomaterials have immense potential, and the application of nanomaterials (inorganic/organic/polymer) can improve the performance of photovoltaic solar cells. Tuning of nanomaterials increases the functionality, band gap, optical absorption and shape of the materials by multiple orders compared to microscale materials. Hybrid solar cells have the unique properties of inorganic semiconductors along with the film-forming properties of conjugated polymers. Hybrid materials have great potential because of their unique properties and are showing great results at the preliminary stages of research. The advantages of organic/polymer materials are easy processing, roll-to-roll production, lighter weight, and flexible shape and size of the solar cells. The application of nanotechnology in hybrid solar cells has opened the door to the manufacturing of a new class of high-performance devices.
-
-
-
Optimized Energy Efficient Content Distribution Over Wireless Networks with Mobile-to-Mobile Cooperation
Authors: Elias Yaacoub, Fethi Filali and Adnan Abu-DayyaAbstractMajor challenges towards the development of next generation 4G wireless networks include fulfilling the foreseeable increase in power demand of future mobile terminals (MTs) in addition to meeting the high throughput and low latency requirements of emerging multimedia services. Studies show that the high energy consumption of battery-operated MTs will be one of the main limiting factors for future wireless communication systems. Emerging multimedia applications require the MTs' wireless interfaces to be active for long periods while downloading large data sizes, which drains the power of the batteries.
The evolution of MTs with multiple wireless interfaces helps to deal with this problem. This results in a heterogeneous network architecture with MTs that actively use two wireless interfaces: one to communicate with the base station (BS) or access point over a long-range (LR) wireless technology (e.g., UMTS/HSPA, WiMAX, or LTE) and one to communicate with other MTs over a short-range (SR) wireless technology (e.g., Bluetooth or WLAN). Cooperative wireless networks proved to have a lot of advantages in terms of increasing the network throughput, decreasing the file download time, and decreasing energy consumption at MTs due to the use of SR mobile-to-mobile collaboration (M2M). However, the studies in the literature apply only to specific wireless technologies in specific scenarios and do not investigate optimal strategies.
In this work, we consider energy minimization in content distribution with M2M collaboration and derive the optimal solution in a general setup with different wireless technologies on the LR and SR. Scenarios with multicasting and unicasting are investigated. Content distribution delay is also analyzed. Practical implementation aspects of the cooperative techniques are studied and different methods are proposed to overcome the practical limitations of the optimal solution. Simulation results with different technologies on the LR and SR are generated, showing significant superiority of the proposed techniques. Ongoing work is focusing on incorporating quality of service constraints in the energy minimization problem and in designing a testbed validating the proposed methods.
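As a toy illustration of why short-range cooperation can save terminal energy (not the paper's optimization model), the sketch below compares the total MT energy when every terminal downloads the content on the long-range link against the case where one MT downloads it and multicasts it over the short range; the per-byte energy figures, content size and terminal count are invented placeholders.

```python
# Toy illustration (not the paper's optimization model): total MT energy with
# and without short-range (SR) mobile-to-mobile cooperation for distributing
# one content item to N terminals. All energy-per-byte figures are assumed.
CONTENT_BYTES = 50e6
N_TERMINALS = 10
E_LR_RX = 4e-7   # J per byte received on the long-range (LR) interface (assumed)
E_SR_TX = 1e-7   # J per byte transmitted on the SR interface (assumed)
E_SR_RX = 5e-8   # J per byte received on the SR interface (assumed)

def energy_no_cooperation():
    # Every terminal downloads the full content on the LR link.
    return N_TERMINALS * CONTENT_BYTES * E_LR_RX

def energy_with_cooperation():
    lr_leader = CONTENT_BYTES * E_LR_RX        # one MT downloads on the LR link
    sr_multicast = CONTENT_BYTES * E_SR_TX     # and multicasts it once on the SR link
    sr_receivers = (N_TERMINALS - 1) * CONTENT_BYTES * E_SR_RX
    return lr_leader + sr_multicast + sr_receivers

print("no cooperation  :", energy_no_cooperation(), "J")
print("with cooperation:", energy_with_cooperation(), "J")
```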
-
-
-
Design of Novel Gas-to-Liquid Reactor Technology Utilizing Fundamental and Applied Research Approaches
Authors: Nimir Elbashir, Aswani Mogalicherla and Elfatih ElmalikAbstractThis research work is in line with the State of Qatar's aspiration of becoming the “gas capital of the world”, as it focuses on developing cost-effective Gas-to-Liquid (GTL) technologies via Fischer-Tropsch synthesis (FTS). The objective of our present research activities is to develop a novel approach to FTS reactor design by controlling the thermo-physical characteristics of the reaction media through the introduction of a supercritical solvent.
The research is facilitated by QNRF through the flagship National Priorities Research Program, highlighting the importance of the subject matter to Qatar. It is a multidisciplinary consortium comprising highly specialized teams of foremost scientists in their fields from four universities.
FTS is the focal process in which natural gas is converted to ultra-clean liquid-based fuels; it is a highly complex chemical reaction in which synthesis gas (a mixture of hydrogen and carbon monoxide) enters the reactor and propagates to various hydrocarbons over a metallic catalyst. Many factors impede the current commercial FTS technologies, chiefly transport and thermal limitations due to the nature of the phase of operation (classically either liquid or gas phase). Interestingly, the most advanced FTS technologies that employ either the liquid phase or the gas phase are currently in operation in Qatar (Shell's Pearl project and Sasol's Oryx GTL plant).
This project is concerned with the design of an FTS reactor to be operated under supercritical fluid conditions in order to leverage certain advantages over the aforementioned commercial technologies. The design of this novel reactor is based on phase behavior and kinetic studies of the non-ideal SCF media, in addition to a series of process integration and optimization studies coupled with the development of sophisticated dynamic control systems. These results are currently being used at TAMUQ to build a bench-scale reactor to verify the simulation studies.
To date, our collective research has yielded 8 peer-reviewed publications, more than 8 conference papers and proceedings, as well as numerous presentations in international conferences. It is noteworthy to mention that an advisory board composed of experts from the world leading energy companies follows the progress of this project toward its ultimate goal.
-
-
-
Hierarchical Cellular Structures with Tailorable Properties
Authors: Abdel Magid Hamouda, Amin Ajdari, Babak Haghpanah Jahromi and Ashkan VaziriAbstractHierarchical structures are found in many natural and man-made materials [1]. This structural hierarchy plays an important role in determining the overall mechanical behavior of the structure. It has been suggested that increasing the hierarchical level of a structure will result in a better performing structure [2]. In addition, honeycombs are well-known structures for lightweight, high-strength applications [3]. In this work, we have studied the mechanical properties of honeycombs with hierarchical organization using theoretical, numerical and experimental methods. The hierarchical organization is made by replacing the edges of a regular honeycomb structure with smaller regular honeycombs. Our results showed that honeycombs with structural hierarchy have superior properties compared to regular honeycombs: a relatively broad range of elastic properties, and thus behavior, can be achieved by tailoring the structural organization of hierarchical honeycombs, and more specifically the two dimension ratios. Increasing the level of hierarchy provides a wider range of achievable properties. Further optimization should be possible by also varying the thickness of the hierarchically introduced cell walls, and thus the relative distribution of mass between different hierarchy levels. These hierarchical honeycombs can be used in the development of novel lightweight multifunctional structures, for example as the cores of sandwich panels, or in the development of lightweight deployable energy systems.
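For context, the classical Gibson-Ashby scalings for a regular (non-hierarchical) hexagonal honeycomb relate the relative density and in-plane stiffness to the cell-wall thickness t and wall length l; these are textbook background relations, not results of this work, and the proportionality constants are omitted.

```latex
% Classical scalings for a regular hexagonal honeycomb (background only,
% not results of this work); t is wall thickness, l is wall length.
\begin{align}
  \frac{\rho^{*}}{\rho_{s}} &\propto \frac{t}{l}, &
  \frac{E^{*}}{E_{s}} &\propto \left(\frac{t}{l}\right)^{3}
\end{align}
```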
-