Compare commits
10 Commits · 1b78f1904d...v1.2.1
| SHA1 |
|---|
| c7d044beed |
| 76df19f332 |
| 22695a2281 |
| f1dd9bd184 |
| dd2ac058af |
| 98c749477c |
| 92973da658 |
| f82978e2e8 |
| 7007478c3b |
| fb728d5033 |
.gitignore (vendored) · 4 changes

@@ -175,3 +175,7 @@ cython_debug/
 .pypirc
 /individual_images
+*.xlsx
+*.csv
+*.snirf
+*.json
changelog.md · 38 changes

@@ -1,7 +1,29 @@
+# Version 1.2.1
+
+- Added a requirements.txt file to ensure compatibility
+- Added new options 'Missing Events Bypass' and 'Analysis Clearing Bypass' to the Preferences Menu
+  - Missing Events Bypass allows comparing events in the Group Viewers even if not all participants in the group have the event present. Fixes [Issue 28](https://git.research.dezeeuw.ca/tyler/flares/issues/28)
+  - Clicking Process after an analysis has been performed will now clear the existing analysis by default, with a popup warning that the analysis will be cleared
+  - Analysis Clearing Bypass will prevent the popup and will not clear the existing analysis data. Fixes [Issue 41](https://git.research.dezeeuw.ca/tyler/flares/issues/41)
+- Clicking 'Clear' should now actually properly clear all data. Hopefully fixes [Issue 9](https://git.research.dezeeuw.ca/tyler/flares/issues/9) for good
+- Setting SHORT_CHANNEL to False will now grey out SHORT_CHANNEL_REGRESSION, as it is impossible to regress what does not exist. Sets SHORT_CHANNEL_REGRESSION to False under the hood when it is greyed out, regardless of what is displayed. Fixes [Issue 47](https://git.research.dezeeuw.ca/tyler/flares/issues/47)
+- Projects can now be saved if files have different parent folders. Fixes [Issue 48](https://git.research.dezeeuw.ca/tyler/flares/issues/48)
+- It is no longer possible to attempt a save before any data has been processed. A popup will now display if a save is attempted with nothing to save
+- Fixed a bug where LONG_CHANNEL_THRESH was not being applied in the processing steps
+- Added a new option in the Analysis window for Group Functional Connectivity. Implements [Issue 50](https://git.research.dezeeuw.ca/tyler/flares/issues/50)
+  - Group Functional Connectivity is still in development and its results should currently be taken with a grain of salt
+  - A warning disclosing this is displayed when entering the Group Functional Connectivity Viewer
+- Fixed a bug when updating optode positions that would prevent .txt files from being selected. Fixes [Issue 54](https://git.research.dezeeuw.ca/tyler/flares/issues/54)
+- Fixed a bug where the secondary download server would never get contacted if the primary failed
+- Automatic downloads will now ignore prerelease versions. Fixes [Issue 52](https://git.research.dezeeuw.ca/tyler/flares/issues/52)
+
+
 # Version 1.2.0
 
+- This is a save-breaking release due to a new save file format. Please update your project files to ensure compatibility. Fixes [Issue 30](https://git.research.dezeeuw.ca/tyler/flares/issues/30)
 - Added new parameters to the right side of the screen
   - These parameters include SHOW_OPTODE_NAMES, SECONDS_TO_STRIP_HR, MAX_LOW_HR, MAX_HIGH_HR, SMOOTHING_WINDOW_HR, HEART_RATE_WINDOW, BAD_CHANNELS_HANDLING, MAX_DIST, MIN_NEIGHBORS, L_TRANS_BANDWIDTH, H_TRANS_BANDWIDTH, RESAMPLE, RESAMPLE_FREQ, STIM_DUR, HRF_MODEL, HIGH_PASS, DRIFT_ORDER, FIR_DELAYS, MIN_ONSET, OVERSAMPLING, SHORT_CHANNEL_REGRESSION, NOISE_MODEL, BINS, and VERBOSITY.
+  - Certain parameters now have dependencies on other parameters and will now grey out if they are not used
   - All the new parameters have default values matching the underlying values in version 1.1.7
 - The order of the parameters has changed to match the order that the code runs when the Process button is clicked
 - Moved TIME_WINDOW_START and TIME_WINDOW_END to the 'Other' category

@@ -15,6 +37,20 @@
 - Fixed the User Guide window to properly display information about the 24 stages and added a link to the Git wiki page
 - MAX_WORKERS should now properly respect the value set
 - Added a new CSV export option to be used by other applications
+- Added support for updating optode positions directly from an .xlsx file from a Polhemus system
+- Fixed an issue where the dropdowns in the Viewer windows would immediately open and close when using a trackpad
+- glover and spm hrf models now function as intended without crashing. Currently, group analysis is still only supported by fir. Fixes [Issue 8](https://git.research.dezeeuw.ca/tyler/flares/issues/8)
+- Clicking 'Clear' should now properly clear all data. Fixes [Issue 9](https://git.research.dezeeuw.ca/tyler/flares/issues/9)
+- Revamped the fold channels viewer to not hang the application and to better process multiple participants at once. Fixes [Issue 34](https://git.research.dezeeuw.ca/tyler/flares/issues/34), [Issue 31](https://git.research.dezeeuw.ca/tyler/flares/issues/31)
+- Added a Preferences menu to the navigation bar
+  - Two preferences have been added that allow bypassing the warnings for detected 2D data and for save files from previous, potentially breaking versions
+- Fixed a typo when saving a CSV that stated a SNIRF was being saved
+- Loading a save file now properly restores AGE, GENDER, and GROUP. Fixes [Issue 40](https://git.research.dezeeuw.ca/tyler/flares/issues/40)
+- Saving a project no longer makes the main window become unresponsive. Fixes [Issue 43](https://git.research.dezeeuw.ca/tyler/flares/issues/43)
+- Memory usage should no longer grow when generating lots of images multiple times. Fixes [Issue 36](https://git.research.dezeeuw.ca/tyler/flares/issues/36)
+- Added a new option in the Analysis window for Functional Connectivity
+  - Functional Connectivity is still in development and its results should currently be taken with a grain of salt
+  - A warning disclosing this is displayed when entering the Functional Connectivity Viewer

@@ -105,7 +141,7 @@
 - Added a group option when clicking on a participant's file
   - If no group is specified, the participant will be added to the "Default" group
 - Added option to update the optode positions in a snirf file from the Options menu (F6)
-- Fixed [Issue 3](https://git.research.dezeeuw.ca/tyler/flares/issues/3), [Issue 4](https://git.research.dezeeuw.ca/tyler/flares/issues/4), [Issue 17](https://git.research.dezeeuw.ca/tyler/flares/issues/17), [Issue 21](https://git.research.dezeeuw.ca/tyler/flares/issues/21), [Issue 22](https://git.research.dezeeuw.ca/tyler/flares/issues/22)
+- Fixed [Issue 3](https://git.research.dezeeuw.ca/tyler/flares/issues/3), [Issue 5](https://git.research.dezeeuw.ca/tyler/flares/issues/5), [Issue 17](https://git.research.dezeeuw.ca/tyler/flares/issues/17), [Issue 21](https://git.research.dezeeuw.ca/tyler/flares/issues/21), [Issue 22](https://git.research.dezeeuw.ca/tyler/flares/issues/22)

 # Version 1.0.1
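The updater code that implements the "ignore prerelease versions" entry (Issue 52) is not shown in this compare. As a minimal sketch, assuming release tags parse as PEP 440 versions, a filter could look like the following (the helper name `latest_stable` and the tags are illustrative, not the FLARES implementation):

```python
# Illustrative sketch of prerelease filtering; not the actual FLARES updater.
from packaging.version import Version

def latest_stable(tags: list[str]) -> str | None:
    """Return the newest non-prerelease tag, or None if none exist."""
    stable = [tag for tag in tags if not Version(tag).is_prerelease]
    return max(stable, key=Version, default=None)

print(latest_stable(["1.2.0", "1.2.1", "1.3.0rc1"]))  # -> "1.2.1"
```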
flares.py · 483 changes
@@ -47,9 +47,9 @@ from nilearn.glm.regression import OLSModel
 import statsmodels.formula.api as smf  # type: ignore
 from statsmodels.stats.multitest import multipletests
 
-from scipy import stats
 from scipy.spatial.distance import cdist
 from scipy.signal import welch, butter, filtfilt  # type: ignore
+from scipy.stats import pearsonr, zscore, t
 
 import pywt  # type: ignore
 import neurokit2 as nk  # type: ignore
@@ -58,6 +58,7 @@ import neurokit2 as nk  # type: ignore
 import pyvistaqt  # type: ignore
 import vtkmodules.util.data_model
 import vtkmodules.util.execution_model
+import xlrd
 
 # External library imports for mne
 from mne import (
@@ -90,6 +91,9 @@ from mne_nirs.io.fold import fold_channel_specificity  # type: ignore
 from mne_nirs.preprocessing import peak_power  # type: ignore
 from mne_nirs.statistics._glm_level_first import RegressionResults  # type: ignore
 
+from mne_connectivity.viz import plot_connectivity_circle
+from mne_connectivity import envelope_correlation, spectral_connectivity_epochs, spectral_connectivity_time
+
 # Needs to be set for mne
 os.environ["SUBJECTS_DIR"] = str(data_path()) + "/subjects"  # type: ignore
@@ -123,8 +127,6 @@ SECONDS_TO_KEEP: float
 OPTODE_PLACEMENT: bool
 SHOW_OPTODE_NAMES: bool
 
-HEART_RATE: bool
-
 SHORT_CHANNEL: bool
 SHORT_CHANNEL_THRESH: float
 LONG_CHANNEL_THRESH: float
@@ -189,9 +191,9 @@ TIME_WINDOW_END: int
 MAX_WORKERS: int
 VERBOSITY: bool
 
-AGE = 25  # Assume 25 if not set from the GUI. This will result in a reasonable PPF
-GENDER = ""
-GROUP = "Default"
+AGE: int = 25  # Assume 25 if not set from the GUI. This will result in a reasonable PPF
+GENDER: str = ""
+GROUP: str = "Default"
 
 # These are parameters that are required for the analysis
 REQUIRED_KEYS: dict[str, Any] = {
@@ -928,11 +930,12 @@ def interpolate_fNIRS_bads_weighted_average(raw, max_dist=0.03, min_neighbors=2)
     raw.info['bads'] = [ch for ch in raw.info['bads'] if ch not in bad_ch_to_remove]
 
     print("\nInterpolation complete.\n")
+    print("Bads cleared:", raw.info['bads'])
+    raw.info['bads'] = []
 
     for ch in raw.info['bads']:
         print(f"Channel {ch} still marked as bad.")
 
-    print("Bads cleared:", raw.info['bads'])
     fig_raw_after = raw.plot(duration=raw.times[-1], n_channels=raw.info['nchan'], title="After interpolation", show=False)
 
     return raw, fig_raw_after
@@ -1333,7 +1336,7 @@ def make_design_matrix(raw_haemo, short_chans):
         drift_model=DRIFT_MODEL,
         high_pass=HIGH_PASS,
         drift_order=DRIFT_ORDER,
-        fir_delays=range(15),
+        fir_delays=FIR_DELAYS,
         add_regs=short_chans.get_data().T,
         add_reg_names=short_chans.ch_names,
         min_onset=MIN_ONSET,
@@ -1347,7 +1350,7 @@ def make_design_matrix(raw_haemo, short_chans):
         drift_model=DRIFT_MODEL,
         high_pass=HIGH_PASS,
         drift_order=DRIFT_ORDER,
-        fir_delays=range(15),
+        fir_delays=FIR_DELAYS,
         min_onset=MIN_ONSET,
         oversampling=OVERSAMPLING
     )
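Both call sites now take the FIR delays from the user-configurable FIR_DELAYS parameter instead of the hard-coded range(15). For context, FIR delay regressors end up as design-matrix columns named `<condition>_delay_<k>`, which is exactly what later code splits and queries on. A minimal sketch of that mapping (the condition name "Tapping" is invented for the example):

```python
# Illustrative only: how a FIR delay setting maps to design-matrix column names
# of the form "<condition>_delay_<k>", which later code parses on "_delay_".
FIR_DELAYS = range(0, 15)  # user-configurable; previously hard-coded as range(15)

condition = "Tapping"  # hypothetical event name
fir_columns = [f"{condition}_delay_{k}" for k in FIR_DELAYS]
print(fir_columns[:3])  # ['Tapping_delay_0', 'Tapping_delay_1', 'Tapping_delay_2']
```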
@@ -1659,8 +1662,11 @@ def fold_channels(raw: BaseRaw) -> None:
     landmark_color_map = {landmark: colors[i % len(colors)] for i, landmark in enumerate(landmarks)}
 
     # Iterate over each channel
+    print(len(hbo_channel_names))
+
     for idx, channel_name in enumerate(hbo_channel_names):
 
+        print(idx, channel_name)
         # Run the fOLD on the selected channel
         channel_data = raw.copy().pick(picks=channel_name)  # type: ignore
@@ -1703,6 +1709,7 @@ def fold_channels(raw: BaseRaw) -> None:
     landmark_specificity_data = []
 
+
     # TODO: Fix this
     if True:
         handles = [
@@ -1725,8 +1732,9 @@ def fold_channels(raw: BaseRaw) -> None:
     for ax in axes[len(hbo_channel_names):]:
         ax.axis('off')
 
-    plt.show()
-    return fig, legend_fig
+    #plt.show()
+    fig_dict = {"main": fig, "legend": legend_fig}
+    return convert_fig_dict_to_png_bytes(fig_dict)
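fold_channels now returns rendered PNG bytes instead of live figure objects. The helper convert_fig_dict_to_png_bytes is defined elsewhere in flares.py and is not shown in this compare; a minimal sketch of what such a helper plausibly does with matplotlib (names and behavior assumed, not the project's actual code):

```python
# Illustrative sketch only; the real convert_fig_dict_to_png_bytes may differ.
import io
import matplotlib.pyplot as plt

def convert_fig_dict_to_png_bytes(fig_dict: dict) -> dict[str, bytes]:
    """Render each matplotlib figure to PNG bytes and close it to free memory."""
    out = {}
    for name, fig in fig_dict.items():
        buf = io.BytesIO()
        fig.savefig(buf, format="png", bbox_inches="tight")
        plt.close(fig)  # closing figures avoids the memory growth noted in Issue 36
        out[name] = buf.getvalue()
    return out
```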
@@ -2246,9 +2254,15 @@ def brain_3d_visualization(raw_haemo, df_cha, selected_event, t_or_theta: Litera
     # Get all activity conditions
     for cond in [f'{selected_event}']:
 
-        if True:
-            ch_summary = df_cha.query(f"Condition.str.startswith('{cond}_delay_') and Chroma == 'hbo'", engine='python')  # type: ignore
+        ch_summary = df_cha.query(f"Condition.str.startswith('{cond}_delay_') and Chroma == 'hbo'", engine='python')  # type: ignore
+        print(ch_summary)
+
+        if ch_summary.empty:
+            # not fir model
+            print("No data found for this condition.")
+            ch_summary = df_cha.query(f"Condition in [@cond] and Chroma == 'hbo'", engine='python')
 
         # Use ordinary least squares (OLS) if only one participant
         # TODO: Fix.
         if True:
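The new fallback distinguishes FIR results, whose conditions are named `<event>_delay_<k>`, from canonical HRF results, whose conditions are named exactly `<event>`. A toy illustration of the two query shapes (the DataFrame contents are invented for the example):

```python
# Toy data to illustrate the two pandas query shapes; not FLARES data.
import pandas as pd

df_cha = pd.DataFrame({
    "Condition": ["Tapping_delay_0", "Tapping_delay_1", "Tapping"],
    "Chroma": ["hbo", "hbo", "hbo"],
})

cond = "Tapping"
rows = df_cha.query(f"Condition.str.startswith('{cond}_delay_') and Chroma == 'hbo'", engine="python")
if rows.empty:
    # canonical (non-FIR) models produce one row per condition instead
    rows = df_cha.query("Condition in [@cond] and Chroma == 'hbo'", engine="python")
print(rows)
```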
@@ -2269,6 +2283,9 @@ def brain_3d_visualization(raw_haemo, df_cha, selected_event, t_or_theta: Litera
     valid_channels = ch_summary["ch_name"].unique().tolist()  # type: ignore
     raw_for_plot = raw_haemo.copy().pick(picks=valid_channels)  # type: ignore
 
+    print(f"DEBUG: Model DF rows: {len(model_df)}")
+    print(f"DEBUG: Raw channels: {len(raw_for_plot.ch_names)}")
+
     brain = plot_3d_evoked_array(raw_for_plot.pick(picks="hbo"), model_df, view="dorsal", distance=0.02, colorbar=True, clim=clim, mode="weighted", size=(800, 700))  # type: ignore
 
     if show_optodes == 'all' or show_optodes == 'sensors':
@@ -2804,7 +2821,7 @@ def run_second_level_analysis(df_contrasts, raw, p, bounds):
     result = model.fit(Y)
 
     t_val = result.t(0).item()
-    p_val = 2 * stats.t.sf(np.abs(t_val), df=result.df_model)
+    p_val = 2 * t.sf(np.abs(t_val), df=result.df_model)
     mean_beta = np.mean(Y)
 
     group_results.append({
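Only the import style changed here (scipy.stats.t is now imported directly), but the formula is worth spelling out: a two-sided p-value doubles the upper-tail survival function of |t|. A standalone check with made-up numbers:

```python
# Two-sided p-value from a t statistic: p = 2 * P(T >= |t|) under H0.
import numpy as np
from scipy.stats import t

t_val, df = 2.5, 10
p_val = 2 * t.sf(np.abs(t_val), df=df)
print(round(p_val, 4))  # ~0.0314
```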
@@ -3299,7 +3316,7 @@ def hr_calc(raw):
     # --- Parameters for PSD ---
     desired_bin_hz = 0.1
     nperseg = int(sfreq / desired_bin_hz)
-    hr_range = (30, 180)
+    hr_range = (30, 180)  # TODO: Should this not use the user defined values?
 
     # --- Function to find strongest local peak ---
     def find_hr_from_psd(ch_data):
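For context, nperseg = sfreq / desired_bin_hz gives roughly 0.1 Hz frequency resolution in the Welch PSD, and the 30-180 bpm search band corresponds to 0.5-3 Hz. A self-contained sketch of the peak-picking idea (synthetic signal and an assumed 10 Hz sampling rate; the real find_hr_from_psd may differ):

```python
# Sketch: estimate heart rate as the strongest PSD peak in the 30-180 bpm band.
import numpy as np
from scipy.signal import welch

sfreq = 10.0                      # Hz, assumed sampling rate for the example
tt = np.arange(0, 120, 1 / sfreq)
sig = np.sin(2 * np.pi * 1.2 * tt) + 0.3 * np.random.randn(tt.size)  # 1.2 Hz = 72 bpm

freqs, psd = welch(sig, fs=sfreq, nperseg=int(sfreq / 0.1))  # ~0.1 Hz bins
lo, hi = 30 / 60, 180 / 60        # bpm -> Hz
band = (freqs >= lo) & (freqs <= hi)
hr_bpm = 60 * freqs[band][np.argmax(psd[band])]
print(f"Estimated HR: {hr_bpm:.0f} bpm")  # ~72
```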
@@ -3343,6 +3360,7 @@ def process_participant(file_path, progress_callback=None):
     logger.info("Step 1 Completed.")
 
     # Step 2: Trimming
+    # TODO: Clean this into a method
     if TRIM:
         if hasattr(raw, 'annotations') and len(raw.annotations) > 0:
             # Get time of first event
@@ -3385,7 +3403,7 @@ def process_participant(file_path, progress_callback=None):
         fig_individual["short"] = fig_short_chans
     else:
         short_chans = None
-    get_long_channels(raw, min_dist=SHORT_CHANNEL_THRESH, max_dist=LONG_CHANNEL_THRESH)  # Don't update the existing raw
+    raw = get_long_channels(raw, min_dist=0, max_dist=LONG_CHANNEL_THRESH)  # keep both short channels and all channels up to the threshold length
     if progress_callback: progress_callback(4)
     logger.info("Step 4 Completed.")
@@ -3527,6 +3545,7 @@ def process_participant(file_path, progress_callback=None):
     logger.info("19")
 
     # Step 20: Generate GLM Results
+    if "derivative" not in HRF_MODEL.lower():
         fig_glm_result = plot_glm_results(file_path, raw_haemo, glm_est, design_matrix)
         for name, fig in fig_glm_result:
             fig_individual[f"GLM {name}"] = fig
@@ -3534,6 +3553,7 @@ def process_participant(file_path, progress_callback=None):
     logger.info("20")
 
     # Step 21: Generate Channel Significance
+    if HRF_MODEL == "fir":
         fig_significance = individual_significance(raw_haemo, glm_est)
         for name, fig in fig_significance:
             fig_individual[f"Significance {name}"] = fig
@@ -3568,6 +3588,7 @@ def process_participant(file_path, progress_callback=None):
         [(column, contrast_matrix[i]) for i, column in enumerate(design_matrix.columns)]
     )
 
+    if HRF_MODEL == "fir":
         all_delay_cols = [col for col in design_matrix.columns if "_delay_" in col]
         all_conditions = sorted({col.split("_delay_")[0] for col in all_delay_cols})
@@ -3599,6 +3620,8 @@ def process_participant(file_path, progress_callback=None):
     # Step 23: Compute Contrast Results
     contrast_results = {}
 
+    if HRF_MODEL == "fir":
+
         for cond, contrast_vector in contrast_dict.items():
             contrast = glm_est.compute_contrast(contrast_vector)  # type: ignore
             df = contrast.to_dataframe()
@@ -3618,7 +3641,17 @@ def process_participant(file_path, progress_callback=None):
     if progress_callback: progress_callback(25)
     logger.info("25")
 
-    return raw_haemo, epochs, fig_bytes, cha, contrast_results, df_ind, design_matrix, AGE, GENDER, GROUP, True
+    # TODO: Tidy up
+    # Extract the parameters this file was run with. No need to return age, gender, group?
+    config = {
+        k: globals()[k]
+        for k in __annotations__
+        if k in globals() and k != "REQUIRED_KEYS"
+    }
+
+    print(config)
+
+    return raw_haemo, config, epochs, fig_bytes, cha, contrast_results, df_ind, design_matrix, True
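This comprehension works because the module-level parameters are declared with annotations (see the AGE/GENDER/GROUP change earlier in this diff): at module scope, `__annotations__` maps every annotated global name to its type, so the dict captures each parameter's current value. A toy module-scope demonstration (names invented for the example):

```python
# Toy demonstration of the capture pattern; run at module scope.
AGE: int = 25
GENDER: str = ""
REQUIRED_KEYS: dict = {}
UNANNOTATED = "ignored"  # no annotation, so it never appears in __annotations__

config = {
    k: globals()[k]
    for k in __annotations__
    if k in globals() and k != "REQUIRED_KEYS"
}
print(config)  # {'AGE': 25, 'GENDER': ''}
```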
@@ -3629,3 +3662,421 @@ def sanitize_paths_for_pickle(raw_haemo, epochs):
     # Fix epochs._raw._filenames
     if hasattr(epochs, '_raw') and hasattr(epochs._raw, '_filenames'):
         epochs._raw._filenames = [str(p) for p in epochs._raw._filenames]
+
+
+def functional_connectivity_spectral_epochs(epochs, n_lines, vmin):
+
+    # will crash without this load
+    epochs.load_data()
+    hbo_epochs = epochs.copy().pick(picks="hbo")
+    data = hbo_epochs.get_data()
+    names = hbo_epochs.ch_names
+    sfreq = hbo_epochs.info["sfreq"]
+    con = spectral_connectivity_epochs(
+        data,
+        method=["coh", "plv"],
+        mode="multitaper",
+        sfreq=sfreq,
+        fmin=0.04,
+        fmax=0.2,
+        faverage=True,
+        verbose=True
+    )
+
+    con_coh, con_plv = con
+
+    coh = con_coh.get_data(output="dense").squeeze()
+    plv = con_plv.get_data(output="dense").squeeze()
+
+    np.fill_diagonal(coh, 0)
+    np.fill_diagonal(plv, 0)
+
+    plot_connectivity_circle(
+        coh,
+        names,
+        title="fNIRS Functional Connectivity (HbO - Coherence)",
+        n_lines=n_lines,
+        vmin=vmin
+    )
+
+
+def functional_connectivity_spectral_time(epochs, n_lines, vmin):
+
+    # will crash without this load
+    epochs.load_data()
+    hbo_epochs = epochs.copy().pick(picks="hbo")
+    data = hbo_epochs.get_data()
+    names = hbo_epochs.ch_names
+    sfreq = hbo_epochs.info["sfreq"]
+
+    freqs = np.linspace(0.04, 0.2, 10)
+    n_cycles = freqs * 2
+
+    con = spectral_connectivity_time(
+        data,
+        freqs=freqs,
+        method=["coh", "plv"],
+        mode="multitaper",
+        sfreq=sfreq,
+        fmin=0.04,
+        fmax=0.2,
+        n_cycles=n_cycles,
+        faverage=True,
+        verbose=True
+    )
+
+    con_coh, con_plv = con
+
+    coh = con_coh.get_data(output="dense").squeeze()
+    plv = con_plv.get_data(output="dense").squeeze()
+
+    np.fill_diagonal(coh, 0)
+    np.fill_diagonal(plv, 0)
+
+    plot_connectivity_circle(
+        coh,
+        names,
+        title="fNIRS Functional Connectivity (HbO - Coherence)",
+        n_lines=n_lines,
+        vmin=vmin
+    )
+
+
+def functional_connectivity_envelope(epochs, n_lines, vmin):
+
+    # will crash without this load
+    epochs.load_data()
+    hbo_epochs = epochs.copy().pick(picks="hbo")
+    data = hbo_epochs.get_data()
+
+    env = envelope_correlation(
+        data,
+        orthogonalize=False,
+        absolute=True
+    )
+    env_data = env.get_data(output="dense")
+
+    env_corr = env_data.mean(axis=0)
+    env_corr = np.squeeze(env_corr)
+    np.fill_diagonal(env_corr, 0)
+
+    plot_connectivity_circle(
+        env_corr,
+        hbo_epochs.ch_names,
+        title="fNIRS HbO Envelope Correlation (Task Connectivity)",
+        n_lines=n_lines,
+        vmin=vmin
+    )
+
+
+def functional_connectivity_betas(raw_hbo, n_lines, vmin, event_name=None):
+
+    raw_hbo = raw_hbo.copy().pick(picks="hbo")
+    onsets = raw_hbo.annotations.onset
+
+    # 1. CRITICAL: Update the Raw object's annotations so the GLM sees unique events
+    ann = raw_hbo.annotations
+    new_desc = []
+
+    for i, desc in enumerate(ann.description):
+        new_desc.append(f"{desc}__trial_{i:03d}")
+
+    ann.description = np.array(new_desc)
+
+    # 2. Build the design matrix
+    # TODO: should use the user-defined parameters
+    design_matrix = make_first_level_design_matrix(
+        raw=raw_hbo,
+        hrf_model='fir',
+        fir_delays=np.arange(0, 12, 1),
+        drift_model='cosine',
+        drift_order=1
+    )
+
+    # 3. Run GLM & extract betas
+    glm_results = run_glm(raw_hbo, design_matrix)
+    betas = np.array(glm_results.theta())
+    reg_names = list(design_matrix.columns)
+
+    n_channels = betas.shape[0]
+
+    # ------------------------------------------------------------------
+    # 4. Find unique trial tags (optionally filtered by event)
+    # ------------------------------------------------------------------
+    trial_tags = sorted({
+        col.split("_delay")[0]
+        for col in reg_names
+        if (
+            ("__trial_" in col)
+            and (event_name is None or col.startswith(event_name + "__"))
+        )
+    })
+
+    if len(trial_tags) == 0:
+        raise ValueError(f"No trials found for event_name={event_name}")
+
+    # ------------------------------------------------------------------
+    # 5. Build beta series (average across FIR delays per trial)
+    # ------------------------------------------------------------------
+    beta_series = np.zeros((n_channels, len(trial_tags)))
+
+    for t, tag in enumerate(trial_tags):
+        idx = [
+            i for i, col in enumerate(reg_names)
+            if col.startswith(f"{tag}_delay")
+        ]
+        beta_series[:, t] = np.mean(betas[:, idx], axis=1).flatten()
+
+    # n_channels, n_trials = betas.shape[0], len(onsets)
+    # beta_series = np.zeros((n_channels, n_trials))
+    # for t in range(n_trials):
+    #     trial_indices = [i for i, col in enumerate(reg_names) if col.startswith(f"trial_{t:03d}_delay")]
+    #     if trial_indices:
+    #         beta_series[:, t] = np.mean(betas[:, trial_indices], axis=1).flatten()
+
+    # 6. Normalize each channel so they are on the same scale.
+    # Without this, everything is connected to everything. Apparently this is a big issue in fNIRS?
+    beta_series = zscore(beta_series, axis=1)
+
+    global_signal = np.mean(beta_series, axis=0)
+    beta_series_clean = np.zeros_like(beta_series)
+    for i in range(n_channels):
+        slope, _ = np.polyfit(global_signal, beta_series[i, :], 1)
+        beta_series_clean[i, :] = beta_series[i, :] - (slope * global_signal)
+
+    # 7. Correlation & strict filtering
+    corr_matrix = np.zeros((n_channels, n_channels))
+    p_matrix = np.ones((n_channels, n_channels))
+
+    for i in range(n_channels):
+        for j in range(i + 1, n_channels):
+            r, p = pearsonr(beta_series_clean[i, :], beta_series_clean[j, :])
+            corr_matrix[i, j] = corr_matrix[j, i] = r
+            p_matrix[i, j] = p_matrix[j, i] = p
+
+    # 8. High-bar thresholding (FDR, then |r| > 0.7)
+    reject, _ = multipletests(p_matrix[np.triu_indices(n_channels, k=1)], method='fdr_bh', alpha=0.05)[:2]
+    sig_corr_matrix = np.zeros_like(corr_matrix)
+    triu = np.triu_indices(n_channels, k=1)
+
+    for idx, is_sig in enumerate(reject):
+        r_val = corr_matrix[triu[0][idx], triu[1][idx]]
+        # Only keep the absolute strongest connections
+        if is_sig and abs(r_val) > 0.7:
+            sig_corr_matrix[triu[0][idx], triu[1][idx]] = r_val
+            sig_corr_matrix[triu[1][idx], triu[0][idx]] = r_val
+
+    # 9. Plot
+    plot_connectivity_circle(
+        sig_corr_matrix,
+        raw_hbo.ch_names,
+        title="Strictly Filtered Connectivity (TDDR + GSR + Z-Score)",
+        n_lines=None,
+        vmin=0.7,
+        vmax=1.0,
+        colormap='hot'  # Use 'hot' to make positive connections pop
+    )
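The z-score plus global-signal-regression step is doing real statistical work here, per the comment that without it "everything is connected to everything." A toy demonstration of the effect on synthetic data with a strong shared component (illustration only, not FLARES data):

```python
# Toy demo: a shared (global) component inflates all pairwise correlations;
# z-scoring each channel and regressing out the mean signal removes it.
import numpy as np
from scipy.stats import zscore

rng = np.random.default_rng(0)
global_sig = rng.standard_normal(50)
beta_series = 2.0 * global_sig + rng.standard_normal((8, 50))  # 8 channels, 50 trials

print(np.corrcoef(beta_series)[0, 1].round(2))  # ~0.8, driven by the shared signal

beta_series = zscore(beta_series, axis=1)
gs = np.mean(beta_series, axis=0)
for i in range(beta_series.shape[0]):
    slope, _ = np.polyfit(gs, beta_series[i], 1)
    beta_series[i] -= slope * gs

print(np.corrcoef(beta_series)[0, 1].round(2))  # much smaller (slightly negative) after GSR
```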
+
+
+def get_single_subject_beta_corr(raw_hbo, event_name=None, config=None):
+    """Processes one participant and returns their correlation matrix."""
+    raw_hbo = raw_hbo.copy().pick(picks="hbo")
+    ann = raw_hbo.annotations
+
+    # Rename for trial-level GLM
+    new_desc = [f"{desc}__trial_{i:03d}" for i, desc in enumerate(ann.description)]
+    ann.description = np.array(new_desc)
+
+    if config is None:
+        print("no config")
+        design_matrix = make_first_level_design_matrix(
+            raw=raw_hbo, hrf_model='fir',
+            fir_delays=np.arange(0, 12, 1),
+            drift_model='cosine', drift_order=1
+        )
+    else:
+        print("config")
+        if config.get("SHORT_CHANNEL_REGRESSION") == True:
+            short_chans = get_short_channels(raw_hbo, max_dist=config.get("SHORT_CHANNEL_THRESH"))
+
+            design_matrix = make_first_level_design_matrix(
+                raw=raw_hbo,
+                stim_dur=config.get("STIM_DUR"),
+                hrf_model=config.get("HRF_MODEL"),
+                drift_model=config.get("DRIFT_MODEL"),
+                high_pass=config.get("HIGH_PASS"),
+                drift_order=config.get("DRIFT_ORDER"),
+                fir_delays=config.get("FIR_DELAYS"),
+                add_regs=short_chans.get_data().T,
+                add_reg_names=short_chans.ch_names,
+                min_onset=config.get("MIN_ONSET"),
+                oversampling=config.get("OVERSAMPLING")
+            )
+            print("yep")
+        else:
+            design_matrix = make_first_level_design_matrix(
+                raw=raw_hbo,
+                stim_dur=config.get("STIM_DUR"),
+                hrf_model=config.get("HRF_MODEL"),
+                drift_model=config.get("DRIFT_MODEL"),
+                high_pass=config.get("HIGH_PASS"),
+                drift_order=config.get("DRIFT_ORDER"),
+                fir_delays=config.get("FIR_DELAYS"),
+                min_onset=config.get("MIN_ONSET"),
+                oversampling=config.get("OVERSAMPLING")
+            )
+
+    glm_results = run_glm(raw_hbo, design_matrix)
+    betas = np.array(glm_results.theta())
+    reg_names = list(design_matrix.columns)
+    n_channels = betas.shape[0]
+
+    # Filter trials by event name
+    trial_tags = sorted({
+        col.split("_delay")[0] for col in reg_names
+        if "__trial_" in col and (event_name is None or col.startswith(event_name + "__"))
+    })
+
+    if not trial_tags:
+        return None, None
+
+    # Build beta series
+    beta_series = np.zeros((n_channels, len(trial_tags)))
+    for t, tag in enumerate(trial_tags):
+        idx = [i for i, col in enumerate(reg_names) if col.startswith(f"{tag}_delay")]
+        beta_series[:, t] = np.mean(betas[:, idx], axis=1).flatten()
+        # beta_series[:, t] = np.max(betas[:, idx], axis=1).flatten()  # TODO: Figure out which one to use
+
+    # Z-score and GSR (Global Signal Regression)
+    beta_series = zscore(beta_series, axis=1)
+    global_signal = np.mean(beta_series, axis=0)
+    for i in range(n_channels):
+        slope, _ = np.polyfit(global_signal, beta_series[i, :], 1)
+        beta_series[i, :] -= (slope * global_signal)
+
+    # Correlation matrix
+    corr_matrix = np.corrcoef(beta_series)
+    return corr_matrix, raw_hbo.ch_names
+
+
+def run_group_functional_connectivity(haemo_dict, config_dict, selected_paths, event_name, n_lines, vmin):
+    """Aggregates multiple participants and triggers the plot."""
+    all_z_matrices = []
+    common_names = None
+
+    for path in selected_paths:
+        raw = haemo_dict.get(path)
+        config = config_dict.get(path)
+        if raw is None: continue
+        print(config)
+
+        corr, names = get_single_subject_beta_corr(raw, event_name, config)
+
+        if corr is not None:
+            # Fisher Z-transform for averaging
+            z_mat = np.arctanh(np.clip(corr, -0.99, 0.99))
+            all_z_matrices.append(z_mat)
+            common_names = names
+
+    from scipy.stats import ttest_1samp
+
+    # 1. Convert list to 3D array: (Participants, Channels, Channels)
+    group_z_data = np.array(all_z_matrices)
+
+    print("1")
+
+    # 2. Perform a t-test across the participant dimension (axis 0).
+    # We test if the mean Z-score is different from 0.
+    # Note: scipy can emit a RuntimeWarning here ("Precision loss occurred in moment
+    # calculation due to catastrophic cancellation") when the data are nearly identical,
+    # hence the variance check below.
+
+    print("--- Variance Check ---")
+
+    # group_z_data.shape is (n_participants, n_channels, n_channels)
+    n_channels = group_z_data.shape[1]
+
+    variance_matrix = np.var(group_z_data, axis=0)
+
+    # Find where variance is exactly 0 (or very close to it)
+    zero_var_indices = np.where(variance_matrix < 1e-15)
+    coords = list(zip(zero_var_indices[0], zero_var_indices[1]))
+
+    diag_count = 0
+    non_diag_pairs = []
+
+    for r, c in coords:
+        if r == c:
+            diag_count += 1
+        else:
+            non_diag_pairs.append((r, c))
+
+    print(f"Total pairs with zero variance: {len(coords)}")
+    print(f"Identical diagonals: {diag_count}/{n_channels}")
+
+    if non_diag_pairs:
+        print(f"Warning: {len(non_diag_pairs)} non-diagonal pairs have zero variance!")
+        for r, c in non_diag_pairs[:10]:  # Print first 10
+            print(f"  - Pair: Channel {r} & Channel {c}")
+    else:
+        print("Clean! Zero variance only exists on the diagonals.")
+    print("----------------------")
+
+    t_stats, p_values = ttest_1samp(group_z_data, popmean=0, axis=0)
+    print("2")
+
+    # 3. Multiple comparisons correction (FDR).
+    # We only care about the upper triangle (unique connections).
+    n_channels = p_values.shape[0]
+    triu_indices = np.triu_indices(n_channels, k=1)
+    flat_p = p_values[triu_indices]
+
+    reject, corrected_p = multipletests(flat_p, method='fdr_bh', alpha=0.05)[:2]
+
+    # 4. Create the final "significant" matrix
+    avg_r = np.tanh(np.mean(group_z_data, axis=0))
+    sig_avg_r = np.zeros_like(avg_r)
+
+    # Only keep connections that are significant AND above the vmin (r-threshold)
+    for idx, is_sig in enumerate(reject):
+        row, col = triu_indices[0][idx], triu_indices[1][idx]
+        r_val = avg_r[row, col]
+
+        if is_sig and abs(r_val) >= vmin:
+            sig_avg_r[row, col] = sig_avg_r[col, row] = r_val
+
+    # 5. Plot the significant results
+    # if not all_z_matrices:
+    #     return
+    # # Average and convert back to r
+    # avg_z = np.mean(all_z_matrices, axis=0)
+    # avg_r = np.tanh(avg_z)
+    # # Thresholding
+    # avg_r[np.abs(avg_r) < vmin] = 0
+
+    plot_connectivity_circle(
+        sig_avg_r, common_names, n_lines=n_lines,
+        title=f"Group Connectivity: {event_name if event_name else 'All Events'}",
+        vmin=vmin, vmax=1.0, colormap='hot'
+    )
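The group pipeline averages correlations in Fisher z-space (arctanh), runs a one-sample t-test against zero, FDR-corrects, and maps the mean back with tanh. The z-transform matters because r is bounded and its sampling distribution is skewed, so averaging raw r values biases the result. A quick illustration with invented per-subject correlations:

```python
# Why average correlations in Fisher z-space: arctanh makes r approximately
# normal, so means and t-tests behave; tanh maps the result back to r.
import numpy as np

r_per_subject = np.array([0.9, 0.2, 0.3])

naive_mean = r_per_subject.mean()
z_mean = np.tanh(np.mean(np.arctanh(r_per_subject)))

print(round(naive_mean, 3))  # 0.467
print(round(z_mean, 3))      # ~0.579: high-r subjects carry more weight in z-space
```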
requirements.txt · BIN · new file

Binary file not shown.